
Are the 8 GB cards worth it?

Would you use an 8GB graphics card, and why?

  • No, 8 GB is not enough in 2025

  • Yes, because I think I can play low settings at 1080p

  • I will explain in the comments section


Don't try to misrepresent me by implying that someone earning $30k - the figure I wrote - is middle class; that's the salary of someone working for $15 an hour, quite far from "middle class". As if my logic were as simple as "the smaller number fits into the bigger number". Quoting unrelated statistics may sound intelligent, but it's an irrelevant response to the point I made - that a middle class person can comfortably afford a 5090 if they want to - which is true.



You still have no right to decide who can afford what. You do not know anything about other people's circumstances that affect their financial situation. Implying so is extremely arrogant. Your income and your hobby are just two small pieces of the puzzle we call life.

Edit: Thinking that the world is made up of middle class Americans is just as arrogant.
 
It's simply their "professional" series, so it's accurate to say that the card isn't a version of the 9070 XT - just like the RTX 6000 isn't a version of the RTX 5090, even if they're based on the same silicon.
It all depends on branding. If the rumor is accurate that it'll be branded the "9070 XTX", then yes, it's the same series.

Just to use the US as an example, 65 to 78% of Americans live paycheck to paycheck, with a monthly discretionary income of $1,729 USD in 2018 (newer studies with limited data suggest an even smaller amount today, $250). [That] does not mean people can afford it. All statistics indicate that the card is incredibly unaffordable.
If you have a discretionary income of $1700 or even $250 you can afford the card. It does mean, though, cutting back in other areas and saving for a period of time. The card is far from unobtainium; even many $15/hour fast food workers can buy one -- if they're not attempting to support a family on that figure.
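To put rough numbers on that "saving for a period of time" point, here is a quick back-of-envelope sketch. The ~$2,000 price tag for a 5090-class card is an assumption for illustration, not a quoted figure; the discretionary-income levels are the ones cited above.

# Back-of-envelope: months of saving needed at the discretionary-income
# levels cited above. The card price is an assumed, illustrative figure.
card_price = 2000  # assumed USD price for a 5090-class card
for monthly_discretionary in (1700, 250):
    months = card_price / monthly_discretionary
    print(f"${monthly_discretionary}/month -> {months:.1f} months of saving")
# $1700/month -> 1.2 months of saving
# $250/month -> 8.0 months of saving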

You still have no right to decide who can afford what. You do not know anything about other people's circumstances that affect their financial situation. Implying so is extremely arrogant.
Spare us the false outrage. He's not deciding anything; government statistics and basic math are doing that. By definition, having discretionary income means you have a choice on how to spend it, on non-essentials beyond food, housing, and medical care.
 
It all depends on branding. If the rumor is accurate that it'll be branded the "9070 XTX", then yes, it's the same series.


If you have a discretionary income of $1700 or even $250 you can afford the card. It does mean, though, cutting discretionary spending back in other areas and saving for a period of time. The card is far from unobtainium; even most $15/hour fast food workers can buy one -- if they're not attempting to support a family on that figure.


Spare us the false outrage. He's not deciding anything; government statistics and basic math are doing that. By definition, having discretionary income means you have a choice on how to spend it, on non-essentials beyond food, housing, and medical care.
Finally, someone applying that "critical thinking" that's been mentioned.
 
Spare us the false outrage. He's not deciding anything; government statistics and basic math are doing that. By definition, having discretionary income means you have a choice on how to spend it, on non-essentials beyond food, housing, and medical care.
So how do you decide how much of my income is discretionary? Hm?
 
You actually agreed with the opposing point. PC enthusiasts are willing to spend "small fortunes" to purchase hardware they covet, out of all proportion to the performance that hardware delivers. Collectively, I've spent more on mechanical keyboards in the last few years than the price of a 5090. Am I an enthusiast? No one said "enthusiast = spending money" or "buying the very fastest". It's about not allowing money (and time) to stand in the way. Your enthusiast who restores the old PC may only spend "10 quid" on the purchase -- but he very well may spend hundreds of hours of his time restoring it.


The fact you consider this in any way relevant demonstrates a rather clear emotional bias.


Those who speak in such absolutes are zealots promoting a cause. For my own personal needs, NVidia cards are orders of magnitude better than any competitive offering, for the sheer sake of CUDA alone. And I won't even mention the horrifically bad power efficiency of most AMD cards.


Puerile analogies, since card makers aren't subtracting features. A better analogy is to note that the automatic windows of today's cars roll up no faster than those from the 1980s. Outrageous! These automakers are cheating us blind!

Honestly, this hyperfixation on VRAM to the exclusion of all else is beyond absurd. Even in the face of benchmarks repeatedly showing that 8GB is plenty for 1080p and 1440p resolutions, you continue the farce. It's a mentality that explains why AMD is currently working on 32GB variants that, despite performing no better for you gamers, will still be bought in droves, simply so you can come here and boast about the size of your VRAM buffers. The customer is always right ... even when they're not.

1) The point was, as I read it, that enthusiast items were how you sold things like extra RAM, and how in parallel people go out and buy silly expensive supercars. My point was that this is not the sole domain of enthusiasm...because even ancient cars have enthusiasts. The relative cost is not a point I made, because I also cited gutting antique cars and reinstalling new hardware. You want to make enthusiasm about cost. My point is that enthusiasm is not defined by cost.

2) Nvidia is not constrained by funding for design. The reason their fluid capital matters is the same reason Nissan is where it is: because Nissan doesn't have a ton of money, it cannot afford the research to develop anything new. Nvidia can, and instead of developing something new they've decided they are a software company. This is a fact; how do you read it as me being against them? I do love that you read things in that are not there to suit your bias...but the core argument that they have money is a statement that they likewise should be doing something to move development forward, rather than simply waiting for another company to give them a die shrink so they can move forward.
There is an argument to be made that they are doing this with frame generation...but it's a bad one. It's also not a great look when you, as a company, want to put most of your eggs into the AI-driven improvement basket while offering lower-end cards with insufficient VRAM to support meaningful AI...the core debate of this whole thread.

3) You see, this is just bull ****. You quote without context, and offer an absolute as an answer to what you claim is an absolute. It seems really meaningful when you do it...until you think for two whole nanoseconds and discover that you just wanted to seem profound. Let me simplify this for you. Near the same price point AMD offers more VRAM, of the same type. Intel offers more VRAM, and they offer competing ray tracing. Which is best? The answer is none of them. There is no clear winner. There's a case for all of the cards competing at that tier, which means by definition they do not make the best part, but they do control its placement in the market to make it as profitable for them as possible. This would be absolutely different if there wasn't an 8 GB and a 16 GB model at the same performance level...but instead of making a product that nobody else can compete with they, like Intel, have decided to do just enough. Great for competition, not so much for best.

4) Do you live under a rock? I ask because the issue with PhysX is something you should understand. The ever-shrinking bus, which Nvidia claims is fine because they manage memory better, is a feature. The fact that they've been behind on VRAM quantity is a lack of a feature. I...want to give you the benefit of the doubt, but that'd also be drinking the proverbial Kool-Aid of what Nvidia claims. It's just as bad as AMD's Kool-Aid.

5) Hyperfixating is you projecting. You came to a thread asking if 8 GB cards are worth it, and now complain when people fixate on VRAM. Isn't that like going to a McDonald's and complaining that their dinner menu has too many hamburgers on it? I mean, if you cherry-pick and exclude all cases where VRAM is required then you'll definitely find that it isn't a factor. I, on the other hand, look at the programs I can use professionally, and will tell you that if ZLUDA hadn't been assassinated by Nvidia then the difference between the 8 GB card and the 16 GB card would be the difference between crashing and not...so my money would be on the 9070 or 9070 XT. If I ran ray tracing I'd tell you the extra VRAM is the difference between smooth and choppy. If I run games from 2015 I'd say 8 GB is more than I'll be needing for a long time. People ask about this because a $300 MSRP card, with a $500 street price, with an available +$100 street price to double the VRAM is a large question mark. As such, it's important for people to whom a $500 expenditure is substantial to know whether that 20% increase in price matters. I say that this is a bad choice that Nvidia forced upon the market segment with the 5060, the 5060 Ti 8 GB, and the 5060 Ti 16 GB. You're welcome to say otherwise, but the 2060 is a joke in terms of adoption. The 3060 is almost beating the 4060 despite being "replaced." The 3060 8 GB vs 12 GB comes down to the 12 GB pulling 25 watts less, which is attributable to not having to constantly bang against memory limits and make intensive calls.


Last bit here....the 32GB version of the card is not for gamers. I have no idea why you would assume that. If you looked at the first few people on this forum to respond to that rumor thread you'd be able to understand that the 32GB variant is AMD offering a soft solution to people wanting an AI accelerator or professional hardware that burns through VRAM, without having to spend a small fortune. I'd nominate it for use in 3D rendering, general rendering, 3D scanning, and LLM playtime. With the jump from 16 to 32 you are not doing it for gaming unless you want 4K/8K. You are doing it to satiate stupidly large amounts of data. You're welcome to believe otherwise, but I'll bet my left pinky finger that this'll be the discussion again in 4-6 years when both AMD and Nvidia release crap that has to support whatever the newest toy is: TressFX, ray tracing, or whatever it's called at the time.
 
If you have a discretionary income of $1700 or even $250 you can afford the card. It does mean, though, cutting back in other areas and saving for a period of time. The card is far from unobtainium; even many $15/hour fast food workers can buy one -- if they're not attempting to support a family on that figure.
So if I don't want to buy that card because I consider it a bad deal, despite being able to afford it if I wanted to, am I less of an enthusiast than Random Joe who walks into a store with a fully loaded credit card and walks out with a 5090, despite knowing absolutely f*k all about computers?
 
5) Hyperfixating is you projecting. You came to a thread asking if 8 GB cards are worth it, and now complain when people fixate on VRAM. Isn't that like going to a McDonald's and complaining that their dinner menu has too many hamburgers on it? I mean, if you cherry-pick and exclude all cases where VRAM is required then you'll definitely find that it isn't a factor. I, on the other hand, look at the programs I can use professionally, and will tell you that if ZLUDA hadn't been assassinated by Nvidia then the difference between the 8 GB card and the 16 GB card would be the difference between crashing and not...so my money would be on the 9070 or 9070 XT.
If you use a card "professionally" and spend $300-400 on it, then perhaps you should consider investing more in your profession, considering it's a literal tax write-off. It's safe to say most people considering the 8 GB 5060 Ti won't be using it "professionally".
If I ran ray tracing I'd tell you the extra VRAM is the difference between smooth and choppy. If I run games from 2015 I'd say 8 GB is more than I'll be needing for a long time. People ask about this because a $300 MSRP card, with a $500 street price, with an available +$100 street price to double the VRAM is a large question mark.
MSRP is $380 for the Ti, not $300. I assume you are referring to that, since the base 5060 isn't released yet and so can't have a "street price". It's a $440 card with a $490 16 GB variant (quite the steal for a brand new 16 GB CUDA card), not a $500 card with a $600 16 GB variant like you are claiming - going off in-stock pricing from PCPartPicker for the USA, which is arguably the worst-case scenario among countries for buying a GPU.

It's good to get facts straight.
 
I think we can all agree that the 5060 Ti and its variants are not enthusiast cards.

Good job on restraining yourselves btw.
 
So if I don't want to buy that card because I consider it a bad deal, despite being able to afford it if I wanted to, am I less of an enthusiast than Random Joe who walks into a store with a fully loaded credit card and walks out with a 5090, despite knowing absolutely f*k all about computers?

Man, they don't want to listen to another opinion.

I classify the guy who drops a million dollars on something just as much of an enthusiast as the guy who's broke and races in his mom's minivan that he spray painted red. Enthusiasm has nothing to do with a price point, but the argument seems to be that because people who buy Civics exist, there has to be someone out there who doesn't buy Porsches in the same way. I have watched a man buy a Porsche, drive it for 6 months, and sell it for another Porsche because he got tired of the interior's color. Rich non-carers and poor non-carers exist just as much as rich and poor enthusiasts.


Likewise, some people cannot fathom not buying the best thing you can get. I'll admit that I was that way more than a decade ago. I had to have a 3930K...and lived with it for a decade. Buying the best you can get is silly, because if I had bought basic Sandy Bridge I'd have had almost $500 less in expenditures...which, if it was just dumped into a checking account, would have bought me a system upgrade 5 years later that made the 3930K look like the clown shoes it was. They also cannot fathom wanting little Timmy from down the street to have a crack at affording a gaming PC...because those miserable memes and videos about the PC master race are more accurate than anyone rational would like to admit.


Screw it though. I'm going to the local car racing bowl with a cheap *** Honda and enjoying my time driving. Screw winning, screw being the best, screw any company that wants to hide function behind a paywall that would literally cost them less than $10 to completely obliterate, but in 2025 it's still fine to sell cards that cost more than previous generation consoles that are crippled by their VRAM limitations. This is watching people defend their corporate handlers, and blaming the chafing of their collars on the people who wanted the collars removed.
 
I just buy what I can afford, barely. I tune the shit out of it, and that is what I have. Sure I would like to have the best money can buy, and there was a time when I did buy that kind of hardware.. just has been a long time lol.. I was a young guy with no responsibility, now I have the weight of the world on my shoulders.

Man.. I don't even know how to use MS Office.

But I can tune hardware.. :laugh:
 
I just buy what I can afford, barely. I tune the shit out of it, and that is what I have. Sure I would like to have the best money can buy, and there was a time when I did buy that kind of hardware.. just has been a long time lol.. I was a young guy with no responsibility, now I have the weight of the world on my shoulders.

Man.. I don't even know how to use MS Office.

But I can tune hardware.. :laugh:
No one uses Office; MS Office uses you for their AI model.
 
I just buy what I can afford, barely. I tune the shit out of it, and that is what I have. Sure I would like to have the best money can buy, and there was a time when I did buy that kind of hardware.. just has been a long time lol.. I was a young guy with no responsibility, now I have the weight of the world on my shoulders.

Man.. I don't even know how to use MS Office.

But I can tune hardware.. :laugh:
Exactly my point, thanks. :)

Being a hardware enthusiast is not about the money you mindlessly dump into your PC.
 
Being a hardware enthusiast is not about the money you mindlessly dump into your PC.
Looks over to a second set of DDR5 that barely works with the first set..

Sigh.

I also have a few coolers..
 
Looks over to a second set of DDR5 that barely works with the first set..

Sigh.

I also have a few coolers..
I see what you mean.

To be fair, if I had spare 2 grand to blow on PC parts right now, I'd much rather buy a whole bunch of stuff to build another system than just a single graphics card. Building, tuning and testing is way more fun than watching my FPS counter (and my electricity meter) reach for the stars.
 
If you use a card "professionally" and spend $300-400 on it, then perhaps you should consider investing more in your profession, considering it's a literal tax write-off. It's safe to say most people considering the 8 GB 5060 Ti won't be using it "professionally".

MSRP is $380 for the Ti, not $300. I assume you are referring to that, since the base 5060 isn't released yet and so can't have a "street price". It's a $440 card with a $490 16 GB variant (quite the steal for a brand new 16 GB CUDA card), not a $500 card with a $600 16 GB variant like you are claiming - going off in-stock pricing from PCPartPicker for the USA, which is arguably the worst-case scenario among countries for buying a GPU.

It's good to get facts straight.

I...wow. Cool. You want to die on the fact that it's too expensive, and then tell me that I was underquoting. I'm not sure if that's a pedantic win, or an attempt to justify the $500 as not being so bad if the MSRP is $400 to begin with.


I own a 3080. Cool. It's one of the 10 GB versions. I needed to provide it, professionally, because the 5080 we bought was killed by Nvidia's ***** drivers. The difference between a usable 10 GB 3080 and a non-usable 8 GB card is real, despite the 40 series card being newer. I then had to purchase another laptop to get a 4090 because that's the only laptop with enough VRAM.


Let me finish this thought process with you, because you seem to be incapable of empathy. Not everybody is an independent creator. Some people want to buy a professional card, and actually have to pay the money. It's great you use it as a tax write-off, but assuming that's OK for everyone is 100% weapons-grade stupid. Now that I've explained basic reality, let me suggest that literally any upgrade in the last two generations would have made 16 GB the standard for xx60-level cards from Nvidia...but instead they make the process cheaper with a smaller bus and try to make up the difference with faster VRAM that calculates out to higher throughput. Cool...if there's no other value for VRAM. Cool if its cost hasn't plummeted. Cool if you want to ride the government for tax credits...because you deserve to have your "fair share" of the burden of running your country slide into the pockets of Nvidia and AMD. I mean, it's not like there's any other reason taxes are collected.
My politics aside, somebody will pay for a card. People will find value in cheap and effective 3D scanners that don't cost $60k...and require a $3k laptop on top of that. It's always the people who claim the government should pay for your stuff that want you to spend more. It's almost like the rhetoric that we move forward is always built on the backs of everyone else paying more. It's not like Nvidia could possibly afford to sell their product at more reasonable prices...exemplified by both Nvidia and AMD changing prices on a dime based upon each other's announcements rather than market forces. They could never afford to make less than a 75% profit margin (basic Google search).

In the US, the average farmer exists on subsidies, and has a gross profit margin of 12-15%. If I do the math, Nvidia is telling me they are 5 to 6.25 times as important as the average farmer. That registers to me as a gigantic middle finger when they want to charge $50 (429 minus 379) for less than 10 dollars' worth of DRAM chips. Of course...you are welcome to swallow that hot steaming mug of fecal matter. If there is direct evidence today that 8GB versions of this card are choked by recent games, then we will see a time when this crap hobbles an otherwise decent card to get Nvidia another few dollars in profit on top of their already silly GP.
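For anyone who wants to check the arithmetic behind those ratios, here it is spelled out. All inputs are the figures quoted in this post (the ~75% margin, the 12-15% farm margin, the $379/$429 MSRPs), not independently verified numbers.

# The ratios asserted above, spelled out. Inputs are the poster's figures.
nvidia_margin_pct = 75          # "~75% profit margin"
farm_margin_lo, farm_margin_hi = 12, 15
print(nvidia_margin_pct / farm_margin_hi)  # 5.0
print(nvidia_margin_pct / farm_margin_lo)  # 6.25
print(429 - 379)                # $50 upcharge for the extra 8 GB of VRAM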
 
I...wow. Cool. You want to die on the fact that it's too expensive, and tell me that I was under quoting. I'm not sure if that's a pedantic win, or an attempt to justify the $500 as not being so bad if the MSRP is $400 to begin with.
How about you don't feign shock when someone points out that your... unintentional, I'm sure... misquoting of prices by 25%, in a discussion about prices, is not, in fact, harmless.
 
So how do you decide how much of my income is discretionary? Hm?
Glad you asked. The US BLS surveys a large number of individuals, and subtracts from their total income the taxes, housing, utility and food costs, and other essentials. The remainder is rated discretionary. The DOE uses a simpler formula: they subtract from your income 150% of the baseline poverty level. Other surveys use different methodologies. You can quibble over the details, but the facts remain: the average US resident has a great deal of money they can choose how to spend, without starving to death or winding up homeless in the process.
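For concreteness, here is a minimal sketch of the DOE-style formula described above (income minus 150% of the poverty level). The poverty-guideline figure used as a default is illustrative only, not an official number.

# Minimal sketch of the DOE-style discretionary-income formula described
# above. The poverty_guideline default is an illustrative placeholder.
def doe_discretionary(annual_income, poverty_guideline=15_000):
    return max(0, annual_income - 1.5 * poverty_guideline)

print(doe_discretionary(30_000))  # a $30k earner -> 7500.0 per year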

2) Nvidia is not constrained by funding for design. ... they have money is a statement that they likewise should be doing something to move development forward, rather than simply waiting for another company to give them a die shrink
Oops! If they were simply "waiting for a die shrink" then the 5090 would be using the same GP104 the 1070 did, only manufactured on 4nm. Nvidia has published metrics demonstrating that, on AI performance they've seen a 1000x performance increase in the last 10-12 years, only 10x of which came from die shrinks, and the rest from design improvements. Gaming has seen less uplift, but then NVidia is a trillion dollar company because of AI and datacenter markets; the trivial amount they make from gaming barely moves the needle any more.

3) You see, this is just bull ****. You quote without context, and offer an absolute as answer to what you claim for an absolute.
Oops again! The "absolute" I stated was specifically limited to my own personal situation.

Let me simplify this for you. Near the same price point AMD offers more VRAM, of the same type....
Which does absolutely no good if you're using games that don't need the additional VRAM. Why pay "nearly the same" (i.e. more) for a card that consumes more power, for no benefit whatsoever?

Let's ask the proper question. Why does the mere existence of an 8GB variant offend you so extremely? They're offering consumers choice, which is always -- always -- a good thing. If the card doesn't suit you, don't buy it.

So if I don't want to buy that card because I consider it a bad deal, despite being able to afford it if I could, I'm less of an enthusiast than Random Joe who walks into a store with a fully loaded credit card and walks out with a 5090, despite knowing absolutely f*k all about computers?
Don't place words into my mouth. The context of the argument was whether the card was affordable, not whether one "must" purchase it in order to qualify for some mythical enthusiast title.

4) ... I ask because the issue with PhysX is something you should understand. The ever shrinking bus, that Nvidia claims is fine because they manage memory better is a feature. The fact that they've been behind on VRAM quantity is a lack of a feature
Stop attempting to redefine the English language. VRAM isn't "shrinking". It's simply not expanding at the pace you wish.

I on the other hand look at the programs I can use professionally, and will tell you that if ZLUDA wasn't assassinated by Nvidia then the difference between the 8 GB card and 16 GB card is the difference between crashing and not ... If I run games from 2015 I'd say 8 GB is more than I'll be needing for a long time.
A mighty attempt at goalpost moving, but we're not speaking of professional use. Benchmarks confirm that the majority of games released in the last year -- not "in 2015" -- play as fast on an 8GB card as they do at 16GB ... as long as you're not at 4K resolution.

Not to mention, the median income value is not what people actually earn. It is usually made up of the top 5% of people earning millions, and the other 95% living day by day on, or close to minimum wage.
I missed this whopper the first time around. You, sir, are confusing mean and median. By definition, exactly 50% of people -- not 5% -- earn more than the median income.
 
Glad you asked. The US BLS surveys a large number of individuals, and subtracts from their total income taxes, housing, utility & food costs and other essentials. The remainder is rated discretionary. The DOE uses a simpler formula: they subtract from your income 150% of the baseline poverty level. Other surveys use different methodologies. You can quibble over the details but the facts remain: the average US resident has a great deal of money they have choice in how they spend, without starving to death or winding up homeless in the process.
And that buying choice entitles you to judge how much of a hardware enthusiast they are because...?

Also, tell me, who's more of an enthusiast? One who spends 2k on a new PC, or one who spends 2k on a 5090?

Also, an 18-year-old American kid who just got a brand new top-end PC as a graduation present from high school, or a 36-year-old Southeast Asian bloke who's been building PCs all his life, but could never afford anything above midrange hardware?

Edit: Also, a middle class bloke who's just bought a 5090 on a two-year personal loan, or a middle class bloke who had to skip the last 3 generations only to settle for a 5060 Ti in the end because of an unexpected medical bill?

We're all individuals, you cannot and should not judge anyone's level of PC literacy, interest and skill in the hobby (which is what an enthusiast is) based purely on national income levels and spending habits.

I missed this whopper the first time around. You, sir, are confusing mean and median. By definition, exactly 50% of people -- not 5% -- earn more than the median income.
Fair enough, thanks. Still, my above point stands.
 
8 GB is not enough, end of story.
Better to have a 10-12 GB card even if the GPU is a few percent slower.
Windows uses ~2-2.2 GB of VRAM by default, and with a few programs running the usage can jump to 3 GB, and that's without a single game running.
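If you want to sanity-check that idle VRAM figure on your own machine, one option on an NVIDIA card is to ask the driver via nvidia-smi. This assumes nvidia-smi is installed and on PATH; the numbers in the comment are only an example of what you might see.

# Query current VRAM usage from the NVIDIA driver. Assumes nvidia-smi is
# installed and on PATH; AMD/Intel cards need their vendors' own tools.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=memory.used,memory.total",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(out)  # e.g. "2176 MiB, 8192 MiB" on an idle 8 GB card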
 
Long answer?
No

Short answer?
n
 
It's painful reading people defending the 8 gig variant.
 
It's painful reading people defending the 8 gig variant.
The thread's not "should new 8 GB cards be produced";
the thread is "are 8 GB cards usable".
 
GDDR6 is a lesson learnt for AMD, who tried HBM/HBM2 and failed badly.
They invented Infinity Cache, which guarantees that you don't need fast VRAM.

Inconsequential; the 5090 has more L2 in it than all tiers of cache on the 9070 XT combined, in addition to a twice-as-wide bus and fast GDDR7 memory. It offers the best of both worlds.

And yet rumors of a 32GB 9070 XTX still persist:

"although Frank Azor denied that there will be no 32GB GDDR6 memory version of Radeon RX 9070 XT, our sources confirmed that AMD is developing products with 32GB GDDR6 memory configuration with AIB partners. The GPU chip used is code-named Navi 48, the same as the Radeon RX 9070 series...."


AMD fans are known for being hopeless dreamers. A 32 GB N48 for gaming purposes is utterly worthless, the hardware is too weak.
 
AMD fans are known for being hopeless dreamers. A 32 GB N48 for gaming purposes is utterly worthless, the hardware is too weak.

No it's not, it would just use 630 watts.
 