
Are the 8 GB cards worth it?

Would you use an 8GB graphics card, and why?

  • No, 8 GB is not enough in 2025

  • Yes, because I think I can play low settings at 1080p

  • I will explain in the comments section


Status
Not open for further replies.
XeSS runs at higher quality on Arc hardware, while other GPUs use the compatibility kernel, but yes.
Fair point, not too familiar with Intel's lineup.
 
Does TPU list VRAM/DRAM usage anywhere in its testing? I don't expect them to add this category if they haven't, but from video reviewers' testing, 8GB shows scenarios where it's swapping into DRAM without showing any obvious issues.

As with all things on this topic, it's still game dependent. Sometimes the higher-VRAM card is even using more DRAM in comparison, but then uses less when settings are increased. I wonder if that swap is creating scenarios where 8GB appears to play fine when it has actually already run out. :confused:

Can't confirm anything without the testing just yet, though. Those numbers would be interesting.
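For anyone wanting to eyeball this themselves, a rough sketch (assuming an NVIDIA card with `nvidia-smi` on the PATH) is to poll `nvidia-smi --query-gpu=memory.used,memory.total --format=csv` while playing and flag when allocation nears capacity. Note this reports allocation, not true working-set usage, and it won't directly show spill-over into system DRAM; the parser and threshold below are just illustrative:

```python
import subprocess

def parse_vram_csv(csv_text: str) -> list[tuple[int, int]]:
    """Return (used_mib, total_mib) per GPU from nvidia-smi CSV output."""
    rows = csv_text.strip().splitlines()[1:]  # skip the CSV header row
    result = []
    for row in rows:
        used, total = (field.strip().split()[0] for field in row.split(","))
        result.append((int(used), int(total)))
    return result

def near_capacity(used_mib: int, total_mib: int, threshold: float = 0.95) -> bool:
    """Flag a GPU whose VRAM allocation crosses a chosen fraction of capacity."""
    return used_mib / total_mib >= threshold

def query_gpus() -> str:
    """Ask nvidia-smi for current VRAM numbers (requires an NVIDIA driver)."""
    return subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total", "--format=csv"],
        capture_output=True, text=True, check=True,
    ).stdout

# Example with canned output so it runs without a GPU:
sample = "memory.used [MiB], memory.total [MiB]\n7890 MiB, 8192 MiB"
for used, total in parse_vram_csv(sample):
    print(used, total, near_capacity(used, total))  # 7890 8192 True
```

Polling this in a loop during gameplay would at least show whether an 8GB card is pinned at the ceiling while appearing to "play fine".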
 
There is no reason to buy an 8GB card in 2025
Well, there is no reason to buy anything below a 5090 in 2025. The problem is cost. Cards aren't free. When the 16GB 5060 Ti is $80-100 more, it doesn't make sense over the 8GB one.

What I find fascinating in HUB's video is his drive to oversell the 16GB 5060 Ti. He even went as far as saying that it delivers a breathtaking 4K experience. So let me get this straight: a $450 GPU offers a breathtaking 4K experience (a thing unprecedented in the past, btw), but he somehow had to find a way to crap on Nvidia? Aokay.
 
I mean, it objectively IS a con if you want DLSS. Not sure what your point is here. If you want FSR or XeSS, I believe those run on any brand. They aren't commenting on Nvidia's fairness at all by closing their models, just that if you want them, there is only one way to get them. And love it or hate it, that's true.
Why would someone only want DLSS? Because all of the influencers overhype DLSS as being the only option, to the point that most people aren't aware of FSR or XeSS.
It isn't about Nvidia leaving their upscaler closed to other brands, it's about making biased, opinionated conclusions in what should be an objective review.
Well, tough cookies for you then. Honestly, with inflation these days, 80 bucks is chump change. Price points aren't really up for discussion here anyway, from a vendor's point of view. Nvidia and AMD could totally just not cater to gamers at all and not suffer much, if at all. We aren't in the driver's seat anymore.
Except $80 isn't pocket change to most people, but good for you. All the greed and inflation is why people also complain about $50-60 games becoming $80-100 games, and much like many modern games these days, you're paying more to get less of a graphics card. Nvidia could easily get away with dropping out of the consumer market, since consumer products only account for around 6-7% of their revenue. AMD not so much, since they're a small company in comparison and have found other ways to innovate in graphics, as Nvidia has effectively monopolized the consumer market.
Well, there is no reason to buy anything below a 5090 in 2025. The problem is cost. Cards aren't free. When the 16GB 5060 Ti is $80-100 more, it doesn't make sense over the 8GB one.

What I find fascinating in HUB's video is his drive to oversell the 16GB 5060 Ti. He even went as far as saying that it delivers a breathtaking 4K experience. So let me get this straight: a $450 GPU offers a breathtaking 4K experience (a thing unprecedented in the past, btw), but he somehow had to find a way to crap on Nvidia? Aokay.
How exactly is HUB overselling the 16GB 5060 Ti? The 16GB 5060 Ti is the only one that should exist, though, and it shouldn't be $80 more than the 8GB version. I think someone in this thread, or another thread, listed the price of GDDR7 modules; they aren't exactly expensive.
I find it weird how people keep defending planned obsolescence, but since people keep handing money to companies for it, I think it's deserved at this point.
And yes, a $400-450 GPU in 2025 should be capable of 4K, at least in some instances. If people had always given excuses like "this or that card is for a certain resolution", we would all still be stuck at 1080p.
 
Why would someone only want DLSS?
Because they view it as better? That's not an uncommon view, btw, and no, I do not buy the idea that it's solely due to influencer activities. That's conspiracy hogwash, frankly.

Except $80 isn't pocket change to most people, but good for you.
Ok, maybe pocket change was hyperbole, but the rest of the point stands. By any measure, 80 bucks barely buys a fast food dinner for a larger family these days. Good luck getting much further with it. The truth hurts, yeah, but it doesn't bend.
 
Why are we still having this conversation? Newsflash: in 2025, there's plenty of hardware that isn't enough for gaming, and it's time to let things improve for the better. Why embarrass yourself by coping that 8GB of VRAM is acceptable on a card over 50% faster than a 1080 Ti?
 
Why would someone only want DLSS? Because all of the influencers overhype DLSS as being the only option, to the point that most people aren't aware of FSR or XeSS.
I don't know if they were necessarily overhyping it, but I personally wasn't buying into spending £50-£200 more for an upscaler. The premium made sense if you'd use their full feature suite (including creation), but DLSS alone wasn't that convincing (imo).
 
I don't know if they were necessarily overhyping it, but I personally wasn't buying into spending £50-£200 more for an upscaler. The premium made sense if you'd use their full feature suite (including creation), but DLSS alone wasn't that convincing (imo).
It's double or triple the original tier cost of the card.

That's the main issue with the RTX 5060 8GB: not that it's 8GB, but that it costs far more than what it was originally tiered at, and it's not inflation either.
IMO the cost comes from Nvidia's massive fleet of 720,000 H100s (which may have been getting updated), each of which can draw 700 watts of power. That's 504,000,000 watts, or 504 megawatts; running non-stop for 7 years straight, that works out to roughly 31 TWh of energy.
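To sanity-check that arithmetic using the post's own figures (the 720,000-GPU fleet size and the 7-year runtime are the poster's assumptions, not verified numbers): power sustained over time gives energy, not a larger wattage, so 504 MW for 7 years is roughly 31 TWh:

```python
# Worked arithmetic from the post's own (assumed) figures:
# 720,000 H100s at 700 W each, running continuously for 7 years.
num_gpus = 720_000
watts_per_gpu = 700

power_w = num_gpus * watts_per_gpu   # instantaneous draw in watts
power_mw = power_w / 1e6             # 504.0 MW

hours = 7 * 8760                     # 7 years of non-stop operation
energy_mwh = power_mw * hours        # energy = power x time
energy_twh = energy_mwh / 1e6        # ~30.9 TWh

print(power_mw, round(energy_twh, 1))  # 504.0 30.9
```

The key distinction is units: megawatts measure the rate of consumption, while the total over 7 years is an amount of energy (terawatt-hours).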
 
How exactly is HUB overselling the 16GB 5060 Ti? The 16GB 5060 Ti is the only one that should exist, though, and it shouldn't be $80 more than the 8GB version. I think someone in this thread, or another thread, listed the price of GDDR7 modules; they aren't exactly expensive.
I find it weird how people keep defending planned obsolescence, but since people keep handing money to companies for it, I think it's deserved at this point.
And yes, a $400-450 GPU in 2025 should be capable of 4K, at least in some instances. If people had always given excuses like "this or that card is for a certain resolution", we would all still be stuck at 1080p.
If you don't think 16GB is even worth $80 over 8GB, then how can you in the same breath claim how terrible 8GB is? That doesn't make sense. If you are not willing to spend $80 to get to 16, then we are in agreement: 16GB won't really change your experience enough to warrant 80 bucks, and 8GB is perfectly fine.

And following your train of thought, the 5090 is the only card that should exist. I find it weird people keep defending slow cards like the 9070 XT and the 5070 Ti existing. Nope, 5090 or buy a console. Right?

@AusWolf Since you asked, and I just tried the game (Oblivion): I have to make heavy compromises on a 4090. Can't use DLDSR (the card is too slow), and even with DLSS enabled and Hardware Lumen completely off, the game drops into the 50s (GPU bound). I have to tune the settings down further or go to DLSS Balanced (maybe Performance?) to get a locked 60. Compromises have to be made regardless of your GPU; modern games are heavy, it is what it is.
 
16GB won't really change your experience enough to warrant 80 bucks, and 8GB is perfectly fine.
Come on... After 14 pages here, this sentence is obviously not true. The 5060 Ti 16GB being $430 is a different conversation (personally, I think it's okay for an NVIDIA card; not great, not terrible).

I won't repeat what's already been said, but this hard defence of 8GB is simply strange to me. There's no reason not to pressure AMD/NVIDIA to higher standards. That doesn't mean developers get a pass; they're getting plenty of blame right now.
 
If you don't think 16GB is even worth $80 over 8GB, then how can you in the same breath claim how terrible 8GB is? That doesn't make sense. If you are not willing to spend $80 to get to 16, then we are in agreement: 16GB won't really change your experience enough to warrant 80 bucks, and 8GB is perfectly fine.

And following your train of thought, the 5090 is the only card that should exist. I find it weird people keep defending slow cards like the 9070 XT and the 5070 Ti existing. Nope, 5090 or buy a console. Right?

@AusWolf Since you asked, and I just tried the game (Oblivion): I have to make heavy compromises on a 4090. Can't use DLDSR (the card is too slow), and even with DLSS enabled and Hardware Lumen completely off, the game drops into the 50s (GPU bound). I have to tune the settings down further or go to DLSS Balanced (maybe Performance?) to get a locked 60. Compromises have to be made regardless of your GPU; modern games are heavy, it is what it is.
And following your train of thought, because you have to make compromises with a GPU's settings at every tier, an 8GB 5090 would be acceptable if it were $240 less.
 
There's no reason not to pressure AMD/NVIDIA to higher standards
I am pressuring them to higher standards. But in this case, since they have both an 8GB and a 16GB option for you to choose what fits your needs, what should I pressure them about? The higher standard IS for them to offer a cheaper alternative for those who don't need 16GB and don't want to pay extra for it, and Nvidia delivered it.

And following your train of thought, because you have to make compromises with a GPU's settings at every tier, an 8GB 5090 would be acceptable if it were $240 less.
Absolutely, 100% true. If you could choose between a 32GB 5090, a 24GB 5090, and a 16GB 5090, that would actually be a godsend. I don't see how anyone can disagree with that. It's bonkers. Lots of people would choose to save $200 and get the 16GB instead of the 32.
 
I am pressuring them to higher standards. But in this case, since they have both an 8GB and a 16GB option for you to choose what fits your needs, what should I pressure them about? The higher standard IS for them to offer a cheaper alternative for those who don't need 16GB and don't want to pay extra for it, and Nvidia delivered it.
Pressure for value, not just options. NVIDIA gave us the option to buy a 4080 16GB for $1200; that wasn't okay just because the 4090 had more power/VRAM. And they thankfully got mocked for that and released a "Super" that was cheaper than the base model. The 5060 Ti 8GB is still a bad value card even if a more expensive card exists.
Absolutely, 100% true. If you could choose between a 32GB 5090, a 24GB 5090, and a 16GB 5090, that would actually be a godsend. I don't see how anyone can disagree with that. It's bonkers. Lots of people would choose to save $200 and get the 16GB instead of the 32.
Thinking about it, this is just their laptops, I suppose. Their laptop 5090 is 24GB. These laptops are always weaker than the desktop GPUs, but hey, it's a 5090 by name. :laugh: The lower you go, though, it eventually makes no sense. An 8GB 5090 would be the world's most limited card. And it'd definitely still be above $1000.
 
Pressure for value, not just options. NVIDIA gave us the option to buy a 4080 16GB for $1200; that wasn't okay just because the 4090 had more power/VRAM. And they thankfully got mocked for that and released a "Super" that was cheaper than the base model. The 5060 Ti 8GB is still a bad value card even if a more expensive card exists.

Thinking about it, this is just their laptops, I suppose. Their laptop 5090 is 24GB. These laptops are always weaker than the desktop GPUs, but hey, it's a 5090 by name. :laugh: The lower you go, though, it eventually makes no sense. An 8GB 5090 would be the world's most limited card. And it'd definitely still be above $1000.
Well sure, an 8GB 5090 would be dumb, because when you get to the 5090 class you expect to play games maxed out (and with RT), something you don't really expect from a 60-class chip.

But I'm not the one trying to remove options; everyone else is. I don't have any issues with an 8GB 5060 Ti existing. Even if we exclude the fact that it offers great performance in every AAA game, it's also the golden card for those interested in esports titles, which don't even need VRAM (especially when you are lowering settings for higher framerates and better visibility). But people here are arguing that nope, that option shouldn't exist and these people should spend an extra 100 bucks to get the 16GB version even though they don't need the extra VRAM.
 
But people here are arguing that nope, that option shouldn't exist and these people should spend an extra 100 bucks to get the 16gb version even though they don't need the extra vram.
The argument is that the 8GB version shouldn't exist at the price it's at, but really, the 16GB version should also be cheaper. Also, unless you plan on playing only a select pool of low-demand games or esports titles for the next few years, you will need more VRAM. And even if it were true that you didn't need more VRAM in a few years, why then buy a $400 GPU to do something you could accomplish with a used 580 for under $100? I actually did just this not too long ago for a family member: bought them a cheap $50 4GB GPU because they needed a budget card but didn't care about modern games. Had no reason to buy them anything more expensive. I then upgraded them to a 6800 16GB later.

The overall goal is to protect consumers as well as advocate for better value in GPUs.

The same way people who don't know better about cars get scammed by mechanics, or people who don't know better about fitness get conned by scam diet trends: in this case, people who don't know better about GPUs and VRAM are being taken for a ride by the 5060 Ti 8GB. The only people who should be in support of this product, at this price, are NVIDIA and their shareholders.

Edit: Adding this at the end since earlier TPU charts were referenced about this. In their own video review comparing the 8GB vs 16GB 5060 Ti, they highlight VRAM being a limiting factor in multiple scenarios. It wasn't killing the GPU all the time, but having to allocate resources to DRAM was obviously hurting it.
 
The argument is that the 8GB version shouldn't exist at the price it's at, but really, the 16GB version should also be cheaper. Also, unless you plan on playing only a select pool of low-demand games or esports titles for the next few years, you will need more VRAM. And even if it were true that you didn't need more VRAM in a few years, why then buy a $400 GPU to do something you could accomplish with a used 580 for under $100? I actually did just this not too long ago for a family member: bought them a cheap $50 4GB GPU because they needed a budget card but didn't care about modern games. Had no reason to buy them anything more expensive. I then upgraded them to a 6800 16GB later.

The overall goal is to protect consumers as well as advocate for better value in GPUs.

The same way people who don't know better about cars get scammed by mechanics, or people who don't know better about fitness get conned by scam diet trends: in this case, people who don't know better about GPUs and VRAM are being taken for a ride by the 5060 Ti 8GB. The only people who should be in support of this product, at this price, are NVIDIA and their shareholders.
Your assumption that the RX 580 will get you a similar experience to a 5060 Ti is unwarranted, though. You don't have to only play low-demand games on the 5060 Ti; every game gives you a great experience, assuming you don't try to play everything at mega ultra settings.
 
Your assumption that the RX 580 will get you a similar experience to a 5060 Ti is unwarranted, though. You don't have to only play low-demand games on the 5060 Ti; every game gives you a great experience, assuming you don't try to play everything at mega ultra settings.
I wasn't trying to imply it'd be the same experience; that's my bad. The 5060 Ti is still worlds more powerful, but if you're a budget gamer, the 580 is a budget card under $100... with 8GB VRAM... that does 30-40 fps in newer titles and 60 fps in older ones... released 8 years ago. Is this the greatest experience? No. But it's the same "just drop settings" argument, only now it makes a lot more sense.
 
Just face the facts, you guys. 97% of the people posting in this thread are just too hardcore for 8GB. Give it to a kid or a teen who had a shit brick before, and they'll be the happiest kid on the block.

Go play a game instead of arguing lol.
 
Just face the facts, you guys. 97% of the people posting in this thread are just too hardcore for 8GB. Give it to a kid or a teen who had a shit brick before, and they'll be the happiest kid on the block.

Go play a game instead of arguing lol.
Tempest Rising is actually a lot of fun if you are into RTS.

I don't know if they were necessarily overhyping it, but I personally wasn't buying into spending £50-£200 more for an upscaler. The premium made sense if you'd use their full feature suite (including creation), but DLSS alone wasn't that convincing (imo).
No reviewer has told you that HYPR-RX achieves the same thing as DLSS, and in every game.
 
Tempest Rising is actually a lot of fun if you are into RTS.


No reviewer has told you that HYPR-RX achieves the same thing as DLSS, and in every game.
"same"
 
For gaming at 1440p and above at ultra, you definitely need more than an 8GB card. The least I would consider is 12GB of VRAM in today's gaming era. Also, if someone is into editing and animation, specifically Adobe After Effects with heavy GPU-dependent tasks where rendering eats all of the memory, an 8GB card is the bare minimum now.
 
I am pressuring them to higher standards. But in this case, since they have both an 8GB and a 16GB option for you to choose what fits your needs, what should I pressure them about? The higher standard IS for them to offer a cheaper alternative for those who don't need 16GB and don't want to pay extra for it, and Nvidia delivered it.


Absolutely, 100% true. If you could choose between a 32GB 5090, a 24GB 5090, and a 16GB 5090, that would actually be a godsend. I don't see how anyone can disagree with that. It's bonkers. Lots of people would choose to save $200 and get the 16GB instead of the 32.
I can disagree with that, because it just creates e-waste and increases prices across the board. Decreasing the number of SKUs and making proper decisions during the design phase (like not having to clamshell memory on a budget card) reduces the cost for everyone.

Nobody's paying $3600 for an 8GB card, and no, a 16GB 5090 would not be a "godsend"; that's just delusional.
 
For gaming at 1440p and above at ultra, you definitely need more than an 8GB card. The least I would consider is 12GB of VRAM in today's gaming era. Also, if someone is into editing and animation, specifically Adobe After Effects with heavy GPU-dependent tasks where rendering eats all of the memory, an 8GB card is the bare minimum now.
I wouldn't say that 8 gigs is the bare minimum, but yes, if you want a PC for something other than playing games at high settings, you'll need a GPU with at least 12, though at that point someone could argue that even that isn't enough. Not everyone uses their PCs for the same tasks.
 
Just face the facts, you guys. 97% of the people posting in this thread are just too hardcore for 8GB.
Exactly! We are not the target audience.
For gaming at 1440p and above at ultra, you definitely need more than an 8GB card.
Thank you for reiterating what everyone knows. Oh, and adding to your statement: for gaming at 2160p ultra, you definitely need more than a 12GB card, but hey, let's not nitpick, right?
 
For gaming at 1440p and above at ultra, you definitely need more than an 8GB card. The least I would consider is 12GB of VRAM in today's gaming era. Also, if someone is into editing and animation, specifically Adobe After Effects with heavy GPU-dependent tasks where rendering eats all of the memory, an 8GB card is the bare minimum now.
For gaming at 1440p (not even above that) at ultra, you need a 5090. A 4090 might barely cut it. Ultra settings are not the domain of $400 GPUs anymore.
 