
Are the 8 GB cards worth it?

Would you use an 8GB graphics card, and why?

  • No, 8 GB is not enough in 2025

  • Yes, because I think I can play low settings at 1080p

  • I will explain in the comments section


For what it's worth, I saw a pretty big bump from 3070Ti to 4070Ti. My GPU is strong enough for the most part, but it can get a bit tricky with some games at 4K. Gotta use that cheater mode :)
Exactly. Tinkering and tweaking are a part of PC gaming, and PCs in general. It's all about trade-offs.

Let me elaborate a bit for others: just because people earn the money needed to buy a 5090, it doesn't mean they can afford one. There are a million and one other, more important expenses in life than a graphics card.
Also, exactly spot on. Housing, utilities, food, transportation, children, etc. Just because people make a lot of money doesn't mean they can spend it on a single PC part.
 
View attachment 397880
When you and your buddy buy the same GPU but you wanted to save £50.

VRAM limitations are scenario specific ofc but my god. I don't know the technicalities of exactly what's happening here but I don't like it. :wtf:
You aren't going to be getting those frames with the 8GB card because you aren't going to max out the textures in this game. You'll be laughing all the way to the bank that your friend paid 50 pounds extra for practically no difference in texture quality.
 
Page 30 and counting! WOW!



The question:
Are the 8 GB cards worth it?

And the answer:
Are you willing to turn down a setting or two (aka compromise)?
Are you mainly playing MP titles?

If you say yes to any of the above, then yeah, go for it and enjoy the purchase!

HOWEVER, if you're not willing to compromise, if you mainly play AAA SP titles at maxed-out settings (aka uncompromised gaming), then please look elsewhere and get at least 12GB of VRAM!

Much love <3
 
Page 30 and counting! WOW!
Amazing isn't it? Good or bad, it's a thing...
The question:
Are the 8 GB cards worth it?

And the answer:
Are you willing to turn down a setting or two (aka compromise)?
Are you mainly playing MP titles?

If you say yes to any of the above, then yeah, go for it and enjoy the purchase!

HOWEVER, if you're not willing to compromise, if you mainly play AAA SP titles at maxed-out settings (aka uncompromised gaming), then please look elsewhere and get at least 12GB of VRAM!
THIS! Yes sir! :rockout: It's really as simple as the above statement. Well said! :toast:
 
Let's agree to disagree on that one. There's no reliable mathematics behind randomness.
I have only ever received the Steam survey 2 or 3 times; there isn't anything reliable or accurate about it.
Can we agree that the blame can be placed at the feet of Nvidia for juicing the market?
First with the crypto scam, and now AI, so much so that Nvidia isn't even QA-checking their drivers.
You'll be laughing all the way to the bank that your friend paid 50 pounds extra for practically no difference in texture quality.
And then in a year or two your friend will be laughing because they can run the latest AAA titles without a game crashing and you can't.
 
I have done like 20 of them at least. I used to get them every other month or so, now maybe once a year.
 
I have only ever received the Steam survey 2 or 3 times; there isn't anything reliable or accurate about it.

First with the crypto scam, and now AI, so much so that Nvidia isn't even QA-checking their drivers.

And then in a year or two your friend will be laughing because they can run the latest AAA titles without a game crashing and you can't.
Have you noticed the argument that, because a 4090 struggles in some games at 4K, people say the 7900XTX/XT are not fast enough for 4K? I have seen people say that only the 4090/5090 can play at 4K. I have also seen people say that the 9070XT is faster than the 7900XT; KitGuru even manipulated their numbers to show that. Well, the truth is that gaming follows console support, and if you have AMD hardware that is faster than the PS5, you can laugh at all of this ridiculous noise. It is amazing how hardware socials reflect politics. Someone mentioned that the 7900XTX is 1/4 the price, and a staff member responded that the 5090 was 43% faster. What is 43% in GPU numbers when monitors are only just getting 360 Hz support, at 4x the cost?
 
And then in a year or two your friend will be laughing because they can run the latest AAA titles without a game crashing and you can't.
Sure, in a year or two suddenly we are going to be running 8k textures as bare minimum in games...attaboy
 
My 3070Ti is still OK... some people are just brainwashed.
 
Let's just say, I'll never understand why Steam has to resort to random sampling instead of asking every user to participate in the survey. Nobody understands how their random selection works anyway.
I don't know what methodology Steam is using, but in general, random sampling (with a big enough sample size) is actually pretty accurate.
 
"resort to".

You not understanding doesn't stop those that do from designing accurate data collection, thankfully.
So how do they design it, then? I don't want you to defend them. I want somebody to explain.

I don't know what methodology Steam is using, but in general, random sampling (with a big enough sample size) is actually pretty accurate.
Its accuracy all depends on the sample. Which is... random. And thus, so is its accuracy.
 
Sure, in a year or two suddenly we are going to be running 8k textures as bare minimum in games...attaboy
No, but games are coded for 16GB of unified memory. So right now, 8GB cards are already riding the struggle bus, since game engines don't manage assets appropriately for that memory size.

View attachment 397939
 
You aren't going to be getting those frames with the 8GB card because you aren't going to max out the textures in this game. You'll be laughing all the way to the bank that your friend paid 50 pounds extra for practically no difference in texture quality.
That £50 is less than my weekly shopping. I'd gladly spend that much more for extra security for games that need the extra VRAM.
 
Its accuracy all depends on the sample. Which is... random. And thus, so is its accuracy.
That is just not true. Random sampling is actually the most accurate way of sampling a population.
 
No, but games are coded for 16GB of unified memory. So right now, 8GB cards are already riding the struggle bus, since game engines don't manage assets appropriately for that memory size.

View attachment 397939
I've finished Hogwarts pre-patch on a 3060ti at 3440x1440 with everything maxed out (including RT), DLSS Quality, with just textures + shadows turned down to High. HUB is turning everything to ultra because he is being silly (I'd use a harsher word but whatever). You can bring any card to its knees, including a 5090. So what?

Let's agree to disagree once again.
There is no such thing as agreeing to disagree here. The experts in the field have decided; it's not a debate. We might as well be debating the shape of the earth. Random sampling is incredibly accurate, period. If you can't accept it, then you are just not accepting reality. Which is fine, I guess; each to their own.
 
Can we agree that the blame can be placed at the feet of Nvidia for juicing the market?
No. Are they a part of the problem? Yes. However, they're not even close to being the largest part.

HUB is turning everything to ultra because he is being silly (I'd use a harsher word but whatever).
I'll say it: Steve Walton is being a jackass. Either he's being incompetent, which would not be a shocker, or he's deliberately skewing the results, which would make him a fraud. I don't know which and I don't care. HUB is a waste of time, can't be trusted, and isn't worthy of even a single moment of consideration. This is far from the first time.

There is no such thing as agreeing to disagree here. The experts in the field have decided; it's not a debate. We might as well be debating the shape of the earth. Random sampling is incredibly accurate, period. If you can't accept it, then you are just not accepting reality. Which is fine, I guess; each to their own.
On this point, I have to side with Auswolf; it's not as cut and dried as you imply. It's much more complicated, and that's the reality.
 
I've finished Hogwarts pre-patch on a 3060ti at 3440x1440 with everything maxed out (including RT), DLSS Quality, and textures + shadows turned down to High. HUB is turning everything to ultra because he is being silly (I'd use a harsher word but whatever). You can bring any card to its knees, including a 5090. So what?
It's not a "so what" -- the reason UE5 stutters like crazy on PC is because of its design for unified memory -- it's designed for consoles, so having discrete GPU memory and system RAM causes notorious stuttering when swapping assets back and forth -- and the same thing applies to the size of the memory buffer.

It is what it is. Xbox and PS5 are what devs code for, and then it gets ported to PC - most dev shops aren't spending time downgrading to 8GB, so as a result 8GB has inferior performance in new games.
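To make the "swapping assets back and forth" point concrete, here's a toy Python sketch (the asset sizes and request pattern are completely made up, and this is nowhere near how a real engine streams textures): it pushes the same stream of asset requests through an 8GB and a 16GB pool and counts how many re-uploads over PCIe each budget forces.

```python
import random
from collections import OrderedDict

GB = 1024**3

def stream(requests, sizes, vram_budget):
    """Toy LRU texture pool: count uploads forced by VRAM evictions."""
    resident = OrderedDict()  # asset id -> size, in least-recently-used order
    used = 0
    uploads = 0
    for asset in requests:
        if asset in resident:
            resident.move_to_end(asset)       # already in VRAM, just refresh LRU order
            continue
        uploads += 1                          # asset has to come across the PCIe bus
        size = sizes[asset]
        while used + size > vram_budget:      # evict least-recently-used assets to fit
            _, evicted_size = resident.popitem(last=False)
            used -= evicted_size
        resident[asset] = size
        used += size
    return uploads

random.seed(1)
# Hypothetical working set sized for a console-style unified budget, not real game data.
sizes = {i: random.randint(50, 400) * 1024**2 for i in range(60)}  # 60 assets, 50-400 MB each
requests = [random.randint(0, 59) for _ in range(5000)]

for budget in (8 * GB, 16 * GB):
    print(f"{budget // GB:>2} GB pool: {stream(requests, sizes, budget)} uploads")
```

In this toy setup the 16GB pool ends up uploading each asset roughly once, while the 8GB pool keeps evicting and re-fetching the same data; that kind of churn over the bus is the sort of thing that shows up as frametime spikes.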
 
Sure, in a year or two suddenly we are going to be running 8k textures as bare minimum in games...attaboy
No, not really. Some games are made to use at least 10-12GB of VRAM, and it's only going to get worse as more studios use higher-quality textures and the UE5 game engine.
I've finished Hogwarts pre-patch on a 3060ti at 3440x1440 with everything maxed out (including RT), DLSS Quality, and textures + shadows turned down to High. HUB is turning everything to ultra because he is being silly (I'd use a harsher word but whatever). You can bring any card to its knees, including a 5090. So what?
HUB turns settings to ultra to demonstrate pushing a card to the limit and how badly the 8GB card can struggle vs the 16GB version. And even with 16GB the 1% lows aren't great; with lower settings, the 8GB card probably won't do any better. I find it sad that a 3060 12GB does better in Indiana Jones and the Great Circle than the 4060 8GB.
There is no such thing as agreeing to disagree here. The experts in the field have decided; it's not a debate. We might as well be debating the shape of the earth. Random sampling is incredibly accurate, period. If you can't accept it, then you are just not accepting reality. Which is fine, I guess; each to their own.
What experts? Who? :confused:
Random sampling is not statistically accurate information. I don't understand why some people take the Steam hardware survey as gospel, other than the obvious: because it favorably displays the stats of the brand they like.
 
most dev shops aren't spending time downgrading to 8GB, so as a result 8GB has inferior performance in new games.
Of course they are. That's what low settings are. TLOU 2, for example, runs on a 5500xt and a 4-core CPU from 8 years ago at low settings. How is that possible if "devs don't spend time downgrading"? They always have and they always will; that's why graphics options exist in the game menu.

No, not really. Some games are made to use at least 10-12GB of VRAM, and it's only going to get worse as more studios use higher-quality textures and the UE5 game engine.
There are 0 games right now that need at least 10-12GB of VRAM.
What experts? Who? :confused:
Random sampling is not statistically accurate information. I don't understand why some people take the Steam hardware survey as gospel, other than the obvious: because it favorably displays the stats of the brand they like.
I'm not taking the Steam hardware survey as gospel because I don't know what methodology they are using. I'm saying that all experts in the field of statistics accept random sampling as one of the most accurate methods. Which is self-evident anyway: simple statistics tell you it's very unlikely for a random sample to be far off. You walk into a room with 1000 people, you sample 100 of them, and 90% have an Nvidia GPU. It's incredibly unlikely (like world-altering unlikely) that the actual market share in that room is 50-50 and you somehow managed to randomly select 9 out of 10 having Nvidia. It's more likely you'll win the lottery and get hit by lightning on your way to collect the money.
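For anyone who wants to see the math behind that room example, here's a minimal Python sketch (the 1000-person room and the 50/50 "true" share are just the hypothetical numbers from the post above, not Steam's actual methodology):

```python
import math
import random

# Hypothetical room: 1000 people, true NVIDIA share exactly 50%.
population = ["nvidia"] * 500 + ["other"] * 500

# Binomial approximation: chance a random sample of 100 shows 90 or more NVIDIA users
# when the real share is 50/50.
n, k = 100, 90
p_tail = sum(math.comb(n, i) * 0.5**n for i in range(k, n + 1))
print(f"P(sample says >=90% NVIDIA when reality is 50/50): {p_tail:.1e}")

# Simulate actual random draws of 100 people and see how far they stray from 50.
random.seed(0)
counts = [sum(person == "nvidia" for person in random.sample(population, 100))
          for _ in range(10_000)]
print(f"10,000 random samples: NVIDIA count ranged from {min(counts)} to {max(counts)} out of 100")
```

The simulated samples never come anywhere near 90 out of 100, which is the point of the post above: a genuinely random sample of that size almost never lands far from the true share.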
 
HUB turns settings to ultra to demonstrate pushing a card to the limit and how badly the 8GB card can struggle vs the 16GB version.
Except that HUB's data doesn't match everyone else's, and given their history of bad numbers and benchmark results, they simply can't be trusted.

So can we stop talking about those fools?

And can we stop with the posting of this trash video? Continuously posting it does NOT make it important or valid.
 
Glad you asked. The US BLS surveys a large number of individuals and subtracts from their total income: taxes, housing, utility & food costs, and other essentials. The remainder is rated discretionary. The DOE uses a simpler formula: they subtract from your income 150% of the baseline poverty level. Other surveys use different methodologies. You can quibble over the details, but the fact remains: the average US resident has a great deal of money they can choose how to spend, without starving to death or winding up homeless in the process.
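If it helps, here's what those two definitions look like side by side, with completely made-up household numbers (not actual BLS or DOE figures):

```python
def discretionary_bls(income, taxes, housing, utilities, food, other_essentials):
    # BLS-style: income minus itemized essential spending.
    return income - (taxes + housing + utilities + food + other_essentials)

def discretionary_doe(income, poverty_line):
    # DOE-style: income minus 150% of the baseline poverty level.
    return income - 1.5 * poverty_line

# Entirely hypothetical household, annual figures.
income = 65_000
print("BLS-style:", discretionary_bls(income, taxes=12_000, housing=18_000,
                                      utilities=4_000, food=7_000, other_essentials=6_000))
print("DOE-style:", discretionary_doe(income, poverty_line=15_000))
```

The two methods can land on very different numbers for the same household, which is why the details are worth quibbling over even if the overall conclusion holds.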


Oops! If they were simply "waiting for a die shrink", then the 5090 would be using the same GP104 the 1070 did, only manufactured on 4nm. Nvidia has published metrics demonstrating that they've seen a 1000x increase in AI performance in the last 10-12 years, only 10x of which came from die shrinks, with the rest from design improvements. Gaming has seen less uplift, but then Nvidia is a trillion-dollar company because of the AI and datacenter markets; the trivial amount they make from gaming barely moves the needle any more.


Oops again! The "absolute" I stated was specifically limited to my own personal situation.


Which does absolutely no good if you're playing games that don't need the additional VRAM. Why pay "nearly the same" (i.e. more) for a card that consumes more power, for no benefit whatsoever?

Let's ask the proper question. Why does the mere existence of an 8GB variant offend you so extremely? They're offering consumers choice, which is always -- always -- a good thing. If the card doesn't suit you, don't buy it.


Don't place words into my mouth. The context of the argument was whether the card was affordable, not whether one "must" purchase it in order to qualify for some mythical enthusiast title.


Stop attempting to redefine the English language. VRAM isn't "shrinking". It's simply not expanding at the pace you wish.


A mighty attempt at goalpost moving, but we're not speaking of professional use. Benchmarks confirm that the majority of games released in the last year -- not "in 2015" -- play as fast on an 8GB card as they do on a 16GB one ... as long as you're not at 4K resolution.


I missed this whopper the first time around. You, sir, are confusing mean and median. By definition, exactly 50% of people -- not 5% -- earn more than the median income.
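Quick illustration of that mean/median distinction, with made-up incomes:

```python
import statistics

# Made-up incomes: one high earner drags the mean up, the median barely moves.
incomes = [30_000, 35_000, 40_000, 45_000, 500_000]
print("mean:  ", statistics.mean(incomes))    # 130,000 -- most of the group earns far less
print("median:", statistics.median(incomes))  # 40,000  -- the middle of the distribution
```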

Listen, after reading through all of this I can finally get a feel for you. The feel is that you have no empathy, just like other people, and have an issue with a non-simplified answer. I'm learning that more and more people simply cannot understand a separate perspective...so it's time to stop trying to explain myself to someone who actively doesn't want to understand. You, on the other hand, I get. I don't know if you're a fanboy or someone who financially benefits from Nvidia's practices, but you don't want to entertain a competing perspective.

My complex answer is that if there's an 8GB 5060ti...which there is, then it's only for a person looking backward. It's Nvidia releasing a thing that isn't at its limits everywhere today, but Nvidia sees that before the next generation enters service, those limits will be reached. If it were priced that way, we'd be good. It is not. It's also silly that there's quibbling about 429, 379, or whatever dollar amount you want to set. Street price for most of these is well in excess of $500...so moaning about that is silly when things can't be had for that price, period.

Let me also suggest that I'm personally pissed that Nvidia is charging price premiums for dirt-cheap components. Their development is entirely focused on the AI side of things...which is great for AI applications. It's like GM suddenly announcing that they're spending 90% of their research capital on the infotainment systems of their vehicles while ignoring the actual engines and transmissions. It's development for the thing that isn't the primary focus. The "benefit" of a nice infotainment system is about on par with shiny new interpolative models that make it look like you have the hardware to generate what the previous generation did...yet the 5060 and 5060ti are positioned for minimal improvements at the same target resolutions as their predecessors. Heck, it's almost like most of the gains are interpolative...which would largely not be true if you increased the VRAM. Nah, just cut the bus width by 33% and spec the effective frequency twice as high so you can prove on paper that the throughput is higher...despite DRAM literally being one of the cheapest things you can implement. That isn't me saying that; that's Nvidia's own performance numbers on the same silicon, showing that adding VRAM above 8GB on the 5060ti is a direct improvement over the 8GB version.
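On the bus-width point: peak memory bandwidth is just (bus width in bits ÷ 8) × effective data rate, so a narrower bus paired with faster memory can look fine on paper. A rough sketch with round, illustrative numbers rather than exact SKU specs:

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times effective data rate."""
    return bus_width_bits / 8 * data_rate_gtps

# Illustrative numbers only -- check the real spec sheet for any particular card.
print("192-bit bus @ 18 GT/s:", bandwidth_gb_s(192, 18), "GB/s")  # wider bus, slower memory
print("128-bit bus @ 28 GT/s:", bandwidth_gb_s(128, 28), "GB/s")  # ~33% narrower bus, faster memory
```

On paper the narrower bus comes out slightly ahead here, but the bus width also sets how many memory chips the board carries, which is how a 128-bit card ends up at 8GB with standard 2GB chips (or 16GB in a clamshell layout).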

Consider our difference to be that, in my book, the 8GB 5060ti is a joke. It would have been easier to overlook if the joke was an 8GB 5060 and a 16GB 5060ti, but instead of hiding that from us, Nvidia decided to create an artificial performance segment. That's what sticks in my craw, because they basically outed their own crap, and despite this we are expected to simply be fine with them pushing a product to market at premium pricing, but with a visible timer on how long it'll be relevant. In some cases that timer was up on day one...but you want to argue that Nvidia did right. My retort is what I started with: they did right by the shareholders and profitability, but they did not bring the best thing they could to the table for consumers...and they are not pricing their offerings aggressively. That is a loss for a healthy consumer market...which we happen to not be in.
 
There is no such thing as agreeing to disagree here. The experts in the field have decided; it's not a debate. We might as well be debating the shape of the earth. Random sampling is incredibly accurate, period. If you can't accept it, then you are just not accepting reality. Which is fine, I guess; each to their own.
Tell me how these "experts" measure the accuracy of random sampling, then. Otherwise, I'll keep disagreeing until the end of days.
Who are those experts anyway?
 
Of course they are. That's what low settings are. TLOU 2, for example, runs on a 5500xt and a 4-core CPU from 8 years ago at low settings. How is that possible if "devs don't spend time downgrading"? They always have and they always will; that's why graphics options exist in the game menu.
Look, if you want to go play on a 5500xt and live in denial, go ahead.

I have a 4060 8GB gaming laptop that I love to bits and game on all the time -- but yeah, in newer titles it definitely struggles with frametime spikes even on lower settings -- if I were to buy something today, it would be a unified 32GB AMD Strix Halo over a 4070 8GB or a 5060 8GB in a heartbeat.

8GB realistically is the lowest you can go atm, and having been through the 512MB, 1GB, 2GB, 4GB, and now 8GB debates, this situation is no different. 8GB owners are already starting to have a rough time right now, and that time is going to get much worse in the next few years.

And can we stop with the posting of this trash video? Continuously posting it does NOT make it important or valid.
Guy is literally showing games side by side on the same card with different framebuffers... It's not valid because you don't approve of the settings or what?
 