
Is 8 GB VRAM the minimum target for 2023 gaming?

Is 8 GB VRAM the minimum entry for gaming in 2023 and onwards?

  • Yes

    Votes: 69 56.6%
  • No

    Votes: 53 43.4%

  • Total voters
    122
  • Poll closed.
I believe that too. The 70s are high-performance mid-range cards, the 80s are high-end cards, while the Tis and 90s are the enthusiast ones.
I think it's been like that since... always. Just the differences between the tiers grow and shrink depending on the die/tech/gen, etc.
That's what Nvidia wants you to believe.

The prices may be ridiculous but they don't dictate the tiers.
Yes they do. Money dictates everything, that's just the sad reality of capitalism.

The same amount of money buys you the same amount of performance in the same performance class. For example, 1080p gaming* was achievable on a 350 USD card 10 years ago, and it is achievable on a 350 USD card now. Back then, it was called a Radeon HD 7870, now it's called an RX 6600 XT. That's two numbers down the tier list, so you might think you're getting a lesser product when you're not. The 6600 XT gives you exactly the same gameplay experience in modern games at exactly the same resolution for exactly the same price as the 7870 did 10 years ago. Product classification based on name alone is a psychological thing.

* I'm making this comparison because 1080p monitors weren't expensive back then, and they are even cheaper now.
 
Exactly, that's why I'm not buying one. All I said is that it's a high-end product. I never said it was a good one. ;)
Price determines who a product is targeted at, not its name, or VRAM amount. This is how capitalism works.
Look man, my argument is that the 4070s are 1440p products, because they fall in the middle of the current generation in terms of their performance capabilities, and as such are correctly designed with 12GB VRAM as that's a good amount for a 1440p frame buffer in the current generation.
But the moment you and some others see the phrase "mid-range" you immediately go to "price range" and the conversation goes to $hit.
Yes, pricing sucks and yes this might be the first time ever I upgrade within the same generation (6700XT to 6950XT), because pricing makes the new one irrelevant in terms of price/performance.
But that doesn't change any of the above ^^
 
Look man, my argument is that the 4070s are 1440p products, because they fall in the middle of the current generation in terms of their performance capabilities, and as such are correctly designed with 12GB VRAM as that's a good amount for a 1440p frame buffer in the current generation.
But the moment you and some others see the phrase "mid-range" you immediately go to "price range" and the conversation goes to $hit.
I get that, but it's a wee bit expensive for a 1440p card, don't you think? (Not to mention the fact that even you admitted that it can do 4K.)

Edit: Let's also not forget about the fact that Nvidia was planning to release the now 4070 Ti as the 4080 12 GB. ;)

Yes, pricing sucks and yes this might be the first time ever I upgrade within the same generation (6700XT to 6950XT), because pricing makes the new one irrelevant in terms of price/performance.
I did that with a 6500 XT (it's happily chugging along in my HTPC now) to a 6750 XT, so good luck! :toast:
 
That's what Nvidia wants you to believe.


Yes they do. Money dictates everything, that's just the sad reality of capitalism.

The same amount of money buys you the same amount of performance in the same performance class. For example, 1080p gaming* was achievable on a 350 USD card 10 years ago, and it is achievable on a 350 USD card now. Back then, it was called a Radeon HD 7870, now it's called an RX 6600 XT. That's two numbers down the tier list, so you might think you're getting a lesser product when you're not. The 6600 XT gives you exactly the same gameplay experience in modern games at exactly the same resolution for exactly the same price as the 7870 did 10 years ago. Product classification based on name alone is a psychological thing.

* I'm making this comparison because 1080p monitors weren't expensive back then, and they are even cheaper now.

Buying power of £350 in 2013

This chart shows a calculation of buying power equivalence for £350 in 2013 (price index tracking began in 1750).

For example, if you started with £350, you would need to end with £513.19 in order to "adjust" for inflation (sometimes referred to as "beating inflation").
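In code form, that adjustment is just a multiplication. Here's a minimal sketch; the ~1.466 multiplier is simply the calculator's £513.19 divided by £350, not an official CPI series:

```python
# Minimal sketch of the inflation adjustment quoted above.
# The multiplier is just 513.19 / 350 (the calculator's output), not an official CPI figure.
CUMULATIVE_FACTOR = 513.19 / 350  # ~1.466, i.e. roughly 46.6% cumulative inflation since 2013

def adjust_for_inflation(price_2013_gbp: float) -> float:
    """Scale a 2013 price into today's money using the factor above."""
    return price_2013_gbp * CUMULATIVE_FACTOR

print(f"£350 in 2013 ≈ £{adjust_for_inflation(350):.2f} today")
# Plug the 7970's 2013 price into the same function to see what it maps to today.
```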

Even without adding any of Nvidia's investment and profit on top of that, the 7870 would cost £513 today.
You can calculate how much the 7970 would cost today.

So, the price does not make the 7870 a high-end card. It's still an entry-level GPU.
 
Well, there's something really wrong with that info, because it was nothing like this when I played it, nor is it in that screenshot I added...
Here is my ss / 4k native ultra

[Screenshot: 4kultra.JPG, 4K native Ultra]
 
Even without adding any of Nvidia's investment and profit on top of that, the 7870 would cost £513 today.
You can calculate how much the 7970 would cost today.

So, the price does not make the 7870 a high-end card. It's still an entry-level GPU.
If 513 quid (or even 350 for that matter) is entry-level to you, then I have nothing more to say.
 
Here is my ss / 4k native ultra

[Screenshot: 4kultra.JPG, 4K native Ultra]
I see. Take a reading in the market or any of the other locations with a significant number of NPCs and assets. I bet that you're gonna hit the 8 gigs immediately.
 
I see. Take a reading in the market or any of the other locations with a significant number of NPCs and assets. I bet that you're gonna hit the 8 gigs immediately.
The market is most likely the heaviest part of the game, and even then it consumes 6.4 GB. That's on a 4090 with plenty of VRAM to stretch into; I bet a card with less VRAM would show even lower usage. The best part is that the more you play, the more the VRAM usage drops.

The only conclusion I can draw from all of this is that if you wanna play a game that appears on this list, it's going to look way worse than Plague Tale and will consume double or triple the VRAM.
 
That's on a 4090 with plenty of VRAM to stretch into; I bet a card with less VRAM would show even lower usage.
Yeah, that's not how it works. But it's still strange that a 3070 shows its 8 GB buffer full and you don't see that.
Also, peak VRAM use in a game is what determines its VRAM requirements, so the market being "the heaviest" is not an excuse to claim the game will work on 4 GB cards with ultra settings.
And no, it's not about sponsored titles either. UE4 titles are generally Nvidia-optimized games, and still Hogwarts and Callisto eat VRAM for breakfast.
Plague Tale is an excellent engine, btw, but sadly it's very unlikely we'll see anyone but Asobo using it.
 
Yeah, that's not how it works. But it's still strange that a 3070 shows its 8 GB buffer full and you don't see that.
Also, peak VRAM use in a game is what determines its VRAM requirements, so the market being "the heaviest" is not an excuse to claim the game will work on 4 GB cards with ultra settings.
And no, it's not about sponsored titles either. UE4 titles are generally Nvidia-optimized games, and still Hogwarts and Callisto eat VRAM for breakfast.
Plague Tale is an excellent engine, btw, but sadly it's very unlikely we'll see anyone but Asobo using it.
I didn't say it works on 4 GB cards; I said 8 GB is plenty for 4K Ultra in games that are not on AMD's sponsored list. If they are on it, not only will they look crap, they will also hog VRAM like crazy.
 
I didn't say it works on 4 GB cards; I said 8 GB is plenty for 4K Ultra in games that are not on AMD's sponsored list. If they are on it, not only will they look crap, they will also hog VRAM like crazy.
"If you want to play the best looking games this gen has to offer then 4-5gb vram is plenty even for 4k Ultra. For example, plague tale"
 
"If you want to play the best looking games this gen has to offer then 4-5gb vram is plenty even for 4k Ultra. For example, plague tale"
So I didn't say it works on 4 GB cards... Thanks for agreeing.
 
Unless you use RT and ultra textures, 3070 is sufficient for most games, especially using dlss

The recent HWUB video demonstrates that even with DLSS and at 1080p you have to knock settings all the way down to medium on a game like TLOU in order to avoid stutters. Forget high and ultra settings.

DLSS is not a significant saving grace for the 3070 or other 8GB cards either; most of these newer games coming out do not get the bulk of their memory savings from reduced screen resolution. I am not sure why this idea continues to persist, but screen resolution hasn't been the major determining factor in VRAM usage since 2017. You are saving 1GB by going from 4K to 1080p. DLSS is going to save you a quarter of that at best, assuming there isn't a memory overhead associated with it, in which case it might not save you anything at all. In any case, it's not going to be much of a help for 8GB graphics cards that lack VRAM. It doesn't matter if DLSS enables the cores of the 3070 to process frames faster when its 8GB of VRAM is going to be the bottleneck.
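To put a rough number on that, here's a back-of-the-envelope sketch of how much memory screen-sized render targets take at 4K versus 1080p. The 4 bytes per pixel and the count of 8 targets are assumptions for illustration (real engines mix formats and keep extra buffers), so treat the output as order-of-magnitude only:

```python
# Back-of-the-envelope: memory used by screen-sized render targets at two resolutions.
# Assumes 4 bytes/pixel (e.g. RGBA8) and 8 full-screen targets (G-buffer, depth, HDR,
# post-processing); both numbers are illustrative assumptions, not engine-specific facts.
BYTES_PER_PIXEL = 4
NUM_TARGETS = 8

def render_target_mib(width: int, height: int) -> float:
    return width * height * BYTES_PER_PIXEL * NUM_TARGETS / 2**20

mib_4k = render_target_mib(3840, 2160)      # ~253 MiB
mib_1080p = render_target_mib(1920, 1080)   # ~63 MiB
print(f"4K: ~{mib_4k:.0f} MiB, 1080p: ~{mib_1080p:.0f} MiB, delta: ~{mib_4k - mib_1080p:.0f} MiB")
```

Even allowing for higher-precision formats and multiple frames in flight, the resolution-dependent part stays in the hundreds of MB; the textures and geometry that eat most of the VRAM don't scale with screen resolution at all.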

I don't think it can be overstated just how early we are seeing this level of poor performance from the 3070, either; even the 3GB 1060 got another year of use before games started performing poorly on it. Things can only get worse from here for 8GB cards, and it does not look good.

One issue is that certain games use too much VRAM for no good reason. The Last of Us and Hogwarts are prime examples where simple ini tweaks can reduce VRAM usage by a massive amount without image quality dropping. Running Hogwarts stock at 1080p at high-ish settings, it uses 7.5GB of VRAM; if I set the frame buffer to 3072MB in the ini file, I see no reduction in texture quality, but usage drops to 5.5GB. Going down to 2048MB did not affect textures on my setup either. At 4K you would probably get issues unless running DLSS.
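For anyone curious, here's roughly what that kind of tweak looks like in practice. This is a hypothetical sketch: the Engine.ini path and the r.Streaming.PoolSize cvar are assumptions based on common Unreal Engine community tweaks, not something confirmed by the post above, so check your own install and back the file up first:

```python
# Hypothetical sketch: append a texture streaming pool cap to an Unreal Engine Engine.ini.
# The path and the cvar name are assumptions (common community tweaks for UE4 titles);
# adjust them for your own install, and keep the backup this script writes.
from pathlib import Path

ENGINE_INI = Path.home() / "AppData/Local/Hogwarts Legacy/Saved/Config/WindowsNoEditor/Engine.ini"
POOL_SIZE_MB = 3072  # the value referenced in the post above

backup = ENGINE_INI.with_name(ENGINE_INI.name + ".bak")
backup.write_bytes(ENGINE_INI.read_bytes())          # backup first
with ENGINE_INI.open("a", encoding="utf-8") as f:    # append the override section
    f.write(f"\n[SystemSettings]\nr.Streaming.PoolSize={POOL_SIZE_MB}\n")
print(f"Appended r.Streaming.PoolSize={POOL_SIZE_MB} to {ENGINE_INI}")
```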

We've seen a handful of new titles using a lot of VRAM, to the point where it's forming a trend for AAA titles. This doesn't appear to be a case of poorly optimized games; in fact, if you look at the TLOU frame chart, the FPS is rock stable when you have enough VRAM and the game plays well. The game requires a lot of VRAM but appears to be well optimized.

Ini tweaks can reduce VRAM usage, but you can't categorically say they come with no trade-offs. As in other games with ini tweaks, like Skyrim, there are almost always trade-offs. You may gain lower VRAM usage on one hand but increase texture swapping or crashing on the other. Until there is an objective article or video on the tweaks, it's hard to say what the definitive impact of a given tweak is.

Ini tweaks also make it an apples-to-oranges comparison: you are comparing an out-of-the-box experience to a modified experience. Aside from the fact that most people are not going to bother with ini tweaks, nor should they have to, any ini tweaks should be treated as a side point and not a cure for what is a lack of VRAM. I also don't think many PC owners want to operate on the hope that a game has ini tweaks in order to get a decent experience, or the idea that they'll jerry-rig every new AAA title. If a $500 console does not have to do it, your $1,000+ PC should not have to either.

Indeed, and consoles don't have much more than that available for textures targeting 1440p to 4k anyway. I'd like to see more effort made with ports.

If they're going to make a pc release, they should put effort into said release to make the experience smooth. Bare minimum with the $$ they charge.

The PS5 has 16GB of unified memory, which means the CPU and GPU can access the whole amount. It also has 512 MB separate just for the OS. I can imagine that games are targeting 14-15GB for the PS5.

The consoles also have a dedicated hardware decompression chip, so they can in fact get away with lower VRAM usage, as disk access is greatly accelerated compared to PC.

PCs have those tricks too; devs just don't seem to want to code for them. And what's available for textures isn't drastically more than 8GB. Again, at the prices charged for these PC releases, for a game where all the content is already developed, they should be putting more effort into optimization, inclusion of DirectStorage, etc.

No, PCs do not have a dedicated decompression chip. There are APIs like DirectStorage, but it is not the same as physical hardware. Nvidia has RTX IO built on DirectStorage, but it utilizes general-purpose GPU hardware and not a dedicated chip.

8GB = 1080p
12GB = 1440p
16GB = 4K
Broadly speaking, obviously. It depends on the games.
I honestly wouldn't be so on about this VRAM starvation issue if it weren't for the pricing. AMD sells a 12GB 6700 XT for 400€? Jolly good. Nvidia sells a 12GB 4070 Ti for 1000€? Eat a dick.
It's a 4K-capable card with the VRAM for 1440p. And what we've been seeing these past few weeks are only the first games of the PS5 era. I wouldn't be surprised at all if, even at 1440p, by 2024 we start having people needing to lower textures or ray tracing because the VRAM just can't bear it.

1000 dollars for a 1440p RT capable card "if you lower the textures". FU too, Nvidia.

The difference between 4K and 1080p in modern games is typically 1GB.

A much bigger factor is settings. Your chart should look like this:

8GB - Medium, 1080p to 1440p
12GB - High, 1080p to 1440p
16GB - Very High, 1080p to 4K
24GB - Ultra, 1080p to 4K
 
If the devs or company or whoever decide to release the game on PC, you bet I expect work put into a good PC release, or they won't get my money, or a lot of other people's. PC gamers may be a small handful compared to console players, but if you do a PC release, PC gamers have expectations.

These recent garbage releases are more a reflection on craptacular ports, fanboyism and tech media than it is on Nvidia, while I also agree Nvidia has been light on VRAM. It's pretty easy to see both sides.

It's a quote from a user on this forum from... before your time. But please, read a random quote in my sig and the card I own and draw whatever conclusion you like; you always do anyway, irrespective of my comments and shared experience with how that's treating me. A lot of memes come to mind with you too. Got a point? Or rather, drop it and stay on topic; I would.
----------------------------------------------------
I absolutely expect that for the next 2-3 years, easily, PC game releases will cater to 8GB VRAM (and less) gamers. Sue me, quote me, you do you.
Sure buddy, it's just so hard to discover the logic that happens in your mind all the time. Bias is a thing and it's clear to see. And it's funny as hell. I'll indeed do me. I do it everywhere, to everyone, and also to myself; it's called reflecting upon matters. Oh, and sure, I'm a total ass like that sometimes, or not, depending on what your own stance is.

I'll also be the first to tell you I was wrong, but the fact of the matter is, I'm really not and have virtually never been wrt VRAM, or market developments. I also do reflect on that and if proven wrong, I swallow my tongue. But as long as I read your drivel everywhere about what is supposedly not important and why, I will respond to it with reality checks. And arguments - real ones, like the ones I typed up earlier, supported by examples and history.
 
The difference between 4K and 1080p in modern games is typically 1GB.

A much bigger factor is settings. Your chart should look like this:

8GB - Medium, 1080p to 1440p
12GB - High, 1080p to 1440p
16GB - Very High, 1080p to 4K
24GB - Ultra, 1080p to 4K
Aye, it's a simplification.
In the first place people don't target their GPU expenditures based on quality tiers but on resolution. I'm just making a really simple version of things that applies to buyers.
Obviously, if we're counting real metrics, it's more of a low/medium/high/ultra/ultra+RT scale, and even then there are tons of little knobs to turn on or off in games to make a game more palatable to your GPU.
Still, most people will just buy and run on Ultra and click High if that's not enough, and then Medium, and Low, until they find 60 FPS again.
 
I don't disagree, but my point is that there's a difference between the game swapping data while still delivering a smooth frame rate, and the game swapping data leading to stutters. Your gameplay experience matters, not your VRAM usage, or allocation, or whatever you want to call it. You're playing a game, after all, not benchmarking.
Yeah, I guess we're kinda saying the same thing differently. Thing is, if your bandwidth can't provide the texture you need that didn't fit within allocated VRAM, you get that stutter we both speak of. So in what way is allocation not usage, especially if you look at the bandwidth of Nvidia cards below the 4080 now? If the cache doesn't cover your game or use case properly, you're royally F'ed.
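If anyone wants to take their own readings rather than argue over screenshots, here's a minimal sketch using the pynvml bindings (pip install nvidia-ml-py). Note that it reports driver-side allocation on the card, which is exactly the allocation-versus-actual-use ambiguity we're talking about:

```python
# Minimal VRAM logger for NVIDIA cards using the pynvml bindings (pip install nvidia-ml-py).
# Prints driver-reported allocation on GPU 0 once per second; Ctrl+C to stop.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM: {mem.used / 2**30:5.2f} GiB used / {mem.total / 2**30:5.2f} GiB total")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Per-process figures are also available through pynvml, but they still count what's allocated, not what the engine actually touches each frame, so they won't settle the allocation-vs-usage argument by themselves.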
 
The recent HWUB video demonstrates that even with DLSS and at 1080p you have to knock settings all the way down to medium on a game like TLOU in order to avoid stutters. Forget high and ultra settings.
I find Hardware Unboxed to be just a tad alarmist about the VRAM thing. It is their third video (that I recall of) where they talk about this.

I consider The Last of Us the only game where the RTX 3070 struggles due to lack of VRAM, at least according to that video.
All other games either ran fine on the RTX 3070 or had ray tracing enabled, which I'll call unfair and/or impractical. Neither the RTX 3070 nor the RX 6800 is a good card to get if you are serious about ray tracing, regardless of how much VRAM they have, so enabling it exacerbates the issue artificially.

Yes, the issue exists. Yes, 8GB of VRAM is indeed slowly becoming insufficient for people who want to play the latest AAA games. Yes, NVIDIA is stingy with its VRAM.
My point is that the entire video makes the issue appear worse than it is.
 
the 3070 matching or even beating the 6800, even in the 0.1% lows, on newer titles at 4K max settings, including Hogwarts

 
Bias is a thing and it's clear to see. And it's funny as hell.
Right back at ya.
But as long as I read your drivel everywhere about what is supposedly not important and why,
Again, right back at you. Plus the 'you doing you', bringing in other factors beyond what I'm actually saying and applying them against my statements; that's the funniest part, seeing your arguments fall apart at a fundamental level over and over when you argue so poorly. I'll happily admit when I'm wrong, and when any company does a bad thing; perhaps the key is in reading what I wrote carefully and just addressing that.
No, PCs do not have a dedicated decompression chip.
I must be thinking of the Series X then, not the PS5 specifically, which I didn't mention by name, just 'consoles'. But a valid point made by you, no less.
I find Hardware Unboxed to be just a tad alarmist about the VRAM thing. It is their third video (that I recall of) where they talk about this.
Yes, the issue exists. Yes, 8GB of VRAM is indeed slowly becoming insufficient for people who want to play the latest AAA games. Yes, NVIDIA is stingy with its VRAM.
My point is that the entire video makes the issue appear worse than it is.
Steve from HUB did a great job making the video he set out to make. He intended from the get-go to showcase 8GB of VRAM falling over, and he achieved just that. I'm also not saying he doesn't have a point, but the video showcased this "I'll prove myself right by constructing the test to prove myself right" clear as day. Likewise it would be possible to construct the testing methodology and game samples to do the inverse and cripple the 6800XT and have the 3070 come off much better, but the 6800XT's weakness is well known at this point, and VRAM is the topic du jour.
 
How much do you think is the minimum amount of VRAM for gaming in 2023 and onwards?
12GB VRAM is the new 8GB VRAM
 
No, it is not "in different veins": hardware demands in games increase as time passes, thus what was great hardware-wise 7 years ago is barely cutting it today. I really don't see what's so hard to understand about it.

And while there are always rotten apples when it comes to coding (always have been, always will be), a lot of people are making it out as if the increase in VRAM demands in a lot of recent AAA titles is purely down to poor optimization, rather than coming to the realization that the games are simply being developed for the new consoles first and foremost, and that the hardware demands reflect that.

But sure, be my guest and continue to just blame it on coding, while giving nvidia a free pass on skimping on vram with their planned obsolescence strategy.

I think maybe your reading comprehension needs to be tested. Let me go back to an example that I quoted, and let's look farther into it.

Warframe is the game Digital Extremes released in the last decade. Before that they worked on Unreal... and a handful of other games. Now that we've established their pedigree, let me point out that Warframe was once played on an old Core 2 Quad running 32-bit Windows XP. Literally the same game is running now, and over time its requirements have increased. You will, in a short while, require a 64-bit OS, a card capable of DX11, and a processor made in the last decade. It has the options that most AAA games tout as visually appealing. All of these features absolutely chew into RAM and VRAM, and you can set them to the moon at 4K and get a 470 to cry and give up... but even a 470 can run it today at 60 Hz, 1080p. It runs without the idiotic extras on a card that is... 4 or 5 generations out of date... I'm trying to remember here, so I might be off.

Now, your premise was and is that because games are getting bigger, they need more. I am cool with that when the features are genuinely improving. Warframe can adequately demonstrate this, and I think the above makes it clear that "more requires more" is a pretty obvious conclusion.


Now... let me also hammer home how badly DE sucks at coding. I'm hammering them because it also highlights how bad code can increase requirements without any improvements. Their DX12 renderer is listed as optional, and you can select it from the launch window. It's been experimental for years now... and it's a rollercoaster. One mainline update gives us the DX12 renderer pumping out 1080p 60 Hz without issue; in the next, they decide to turn on filters for wet surfaces and certain areas drop to 30-40 Hz (using Hz as a rough analog for FPS). Literally the only listed change is the implementation of code for wet surfaces... and it took two more patches to first get performance back to 50+ Hz, and then to 60 Hz. This demonstrates that garbage code sank performance when a new feature was implemented, the code was patched, and patched again, and we got a feature that added new stuff while demonstrating it didn't need much in the way of extra resources, but could absolutely tank performance.


So... why do I still think you're wrong? Basic logic. Games are shipped as a beta, we pay for them, and if they're successful they get updates to make them better. That's basic practice for consumer software... and if you think otherwise, provide a single example of a recent AAA or even AA release that doesn't have at least 2 patches before it leaves the 2-month profitability window.
Other companies that have demonstrated that poor optimization is the source of many high-overhead experiences include Warner Brothers (Batman), Epic (Fortnite and the game-corrupting skin purchase), CD Projekt Red (Cyberpunk), and basically everyone else.



Thing is, I see other people stating that this is port syndrome: the PC doesn't matter, so it gets the junior or outsourced dev team. I think that's true to some extent, but it's not an excuse. Putting this in car terms, it's like installing a racing engine in a Geo Metro to get it to 60 mph because the drivetrain you decided to buy is about 7% efficient instead of the 15% of most options. That's not an excusable situation, even if I understand why. What it is, is copping out of doing a good job to meet fiscal targets and putting off doing well until later... if at all.
I say this looking at my Steam library.
Turok - N64 - 0 patches from Nightdive and none possible on the original hardware
Red Faction II - 0 patches for the same reason as Turok
Psychonauts - Not updated by Doublefine and running like a champ
Painkiller - Don't remember updates, but there is an HD rerelease with changes
Bioshock 1/2 - Updates, HD rerelease, and still beautiful today

Hmmm... Seems like this (highly curated) selection of games indicates that the requirement to release in good shape because you can't patch later, combined with the choice to make art style more important than the latest ray tracing garbage, can create games a million times more visually enthralling than the latest idiotic semi-open world with the best bells and whistles... and AAA studios and hardware vendors hope we don't understand this and accept the increased requirements without question, to both decrease QA investment and sell the "need" for upgrades.


If you don't agree, I'm fine with that. How is the TressFX going for you? You remember, the unique tech that made hair seem more real but cost a huge amount of performance. The tech that people immediately disabled, because watching a young rebooted Lara Croft get brutally impaled in a cutscene was so much more viscerally memorable than the fact that her hair was slightly more lifelike while she ganked a bunch of hardened mercenaries as a near-teenage girl who, soaking wet, might tip the scales at 90 pounds. Yeah. Most people kinda tend not to need the goofy bells and whistles... which is why we're debating something like minimum VRAM requirements. Programming is hard, choosing substance over ray tracing is harder, but most difficult of all is coming to the realization that poor code drives most of the perceived need for new hardware. Not all, but most. This is directly because people buy games at release, play to completion, and by the time they're patched into good shape they are already well into the next buggy launch.
Care to argue? Then my examples are going to be simple: anything Bethesda, anything Ubisoft, Halo of all things, Cyberpunk, and Atomic Heart.
A little on that last one: in 2023 it released without an FOV slider... That's not a next-gen feature; it's a basic one we should be able to alter. If you can't support basic features on day one, and have to patch them in almost a month later, then maybe it should be obvious that you weren't really ready to ship... or your MVP is so low-specification that any bug is tolerable so long as you can still boot the game without a crash... even if it performs like a visual pile of hot garbage.
 
Steve from HUB did a great job making the video he set out to make. He intended from the get-go to showcase 8GB of VRAM falling over, and he achieved just that. I'm also not saying he doesn't have a point, but the video showcased this "I'll prove myself right by constructing the test to prove myself right" clear as day. Likewise it would be possible to construct the testing methodology and game samples to do the inverse and cripple the 6800XT and have the 3070 come off much better, but the 6800XT's weakness is well known at this point, and VRAM is the topic du jour.
Generally speaking, a lot of (if not all) Youtube reviewers fall into some typical money grabbing mistakes these days, be it intentional or just a coincidence. Namely:
1. Making a video after someone else's video without properly researching the given topic,
2. Starting an investigation with a premise that the investigation is set up to prove no matter what,
3. Glorifying / shitting on a product to cater to public opinion.

It's all clickbait nonsense. That's why I try to buy and sell as much stuff as I can without too big of a loss, so I can see what's what first-hand. Many times I've seen that a generally hated product isn't actually that bad, or that a generally loved one has non-mentioned flaws that are a dealbreaker to me. My HTPC has got a Core i7-11700 and a 6500 XT in it. Both got an extreme amount of shite from Youtube reviewers, and here I am, loving both of them to bits. I even used that system as my main PC for a while before I built the one I have now, and I was happy with it.

Other than that, TPU is one of the few reliable sources left on the internet. I don't trust Youtube at all, not even the big names.
 