
Are the 8 GB cards worth it?

Would you use an 8GB graphics card, and why?

  • No, 8 GB is not enough in 2025

  • Yes, because I think I can play low settings at 1080p

  • I will explain in the comments section


Pushing an 8GB card at the price it is right now is pushing bad interest; maybe the one writing about gatekeeping should keep themselves in check.
 
No it's not, it would just use 630 watts.

I'd say ~30 W more than the 16 GB version to feed the extra 8 memory chips on the underside. I don't expect a repeat of the RTX 3090 (aka 100 W on the MVDDC rail because of the 24 hungry mem chips). Advertised TGP would likely be the same, so you could very well see the 16 GB model being faster in most scenarios. Hence, utterly worthless. If you need more than 16 GB VRAM right now, then your needs also can't be met by this tier of hardware IMHO.
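A back-of-the-envelope check on that memory-power estimate (a rough sketch; the ~30 W and ~100 W figures are the estimates above, the chip counts come from an 8-chip clamshell addition and the 3090's 24 modules):

```python
# Rough per-chip memory power implied by the estimates above (not measurements).
extra_chips = 8            # 8 extra GDDR6 chips on the underside for a 32 GB clamshell
extra_power_w = 30         # the ~30 W extra draw guessed above

rtx3090_chips = 24         # 24 GDDR6X modules on the RTX 3090
rtx3090_mvddc_w = 100      # ~100 W on the memory rail, per the post

print(f"~{extra_power_w / extra_chips:.1f} W per extra GDDR6 chip")       # ~3.8 W
print(f"~{rtx3090_mvddc_w / rtx3090_chips:.1f} W per 3090 GDDR6X chip")   # ~4.2 W
```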

The only thing a 32 GB 9070 XT would be good for is to drain available stock and divert it straight to the AI people and whoever's left mining. Really.
 
I think we can all agree that the 5060Ti and its variants are not enthusiast cards.

Good job on restraining yourselves btw.
It's a turd.

Mic drop.


On another note.

Pushing an 8GB card at the price it is right now is pushing bad interest; maybe the one writing about gatekeeping should keep themselves in check.
 
I'd say ~30 W more than the 16 GB version to feed the extra 8 memory chips on the underside. I don't expect a repeat of the RTX 3090 (aka 100 W on the MVDDC rail because of the 24 hungry mem chips). Advertised TGP would likely be the same, so you could very well see the 16 GB model being faster in most scenarios. Hence, utterly worthless. If you need more than 16 GB VRAM right now, then your needs also can't be met by this tier of hardware IMHO.

The only thing a 32 GB 9070 XT would be good for is to drain available stock and divert it straight to the AI people and whoever's left mining. Really.
lol I just thought about that

RX 9080 XTX 666 watts Satan edition.

Just wondering, were you saying a 96 compute unit, 6144 shader, 192 tensor core, 96 RT core card, or 128 compute units, 8192 shaders, 256 tensor cores, 128 RT cores?
My thought was the latter.
 
lol I just thought about that

RX 9080 XTX 666 watts Satan edition.

Just wondering, were you saying a 96 compute unit, 6144 shader, 192 tensor core, 96 RT core card, or 128 compute units, 8192 shaders, 256 tensor cores, 128 RT cores?
My thought was the latter.
9080 Hellstone Ultimate Edition.
But really though, a 9080 isn't necessary since the 9070XT is a fast card with more than enough VRAM, and in the current market a 9080 would be over $1000.
 
9080 Hellstone Ultimate Edition.
But really though, a 9080 isn't necessary since the 9070XT is a fast card with more than enough VRAM, and in the current market a 9080 would be over $1000.
16GB is “more than enough”. I’ll have to remember this.
 
In the US, the average farmer exists on subsidies, and has a gross profit margin of 12-15%. If I do the math, Nvidia is telling me they are 5 to 6.25 times as important as the average farmer.
First, you're confusing gross and net profit margins: NVidia's net margin is about 55%, and the margin on their consumer cards is less than half that: perhaps 25%. Historically their margins have been much lower -- in the 12% range a decade ago, and negative (losing money) five years before that. And a decade from now, these high margins will have attracted so much new competition that NVidia will again be back to those low levels. And while the average farmer has a net margin of about 12%, the most profitable farms have operating margins that top 50%.
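For anyone unsure about the gross/net distinction being made here, a minimal sketch with invented numbers (nothing below is NVidia's or any farm's actual figures):

```python
# Toy numbers purely to illustrate gross vs. net margin.
revenue = 100.0
cogs = 40.0    # cost of goods sold: chips, memory, board, assembly
opex = 35.0    # everything else: R&D, marketing, admin, logistics

gross_margin = (revenue - cogs) / revenue          # ignores operating costs
net_margin = (revenue - cogs - opex) / revenue     # what is actually kept

print(f"gross {gross_margin:.0%}, net {net_margin:.0%}")   # gross 60%, net 25%
```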


But your biggest error is the belief that margins are somehow dictated by 'importance', rather than a whole host of factors you should have learned in microeconomics. Since NVidia must amortize billions in R&D costs, they could make considerably more profit by lowering the price and selling more cards -- if they could get the capacity from TSMC. Since they can't, lowering the price would simply revert us to the situation in 2021, where artificially low prices led to product shortages, outages, and widespread scalping.

That registers to me as a gigantic middle finger when they want to charge $50 (429 vs. 379) for less than 10 dollars' worth of DRAM chips.
Again, this reveals a deep misunderstanding not just of the GPU market, but of manufacturing in general. The sole difference between these two cards is *not* "$10 in DRAM chips". The 8GB variant has its own design costs and manufacturing-run startup costs, plus costs to design, order, and print packaging that, while nearly identical, isn't the same. Many countries have per-product registration and certification costs, and it'll require separate inventory, tracking, and -- to a more limited degree -- sales and marketing. And since the 8GB variant will sell in lower volume than the 16GB, all those fixed costs must be amortized against a smaller number of units. The gross margins must be set higher.
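A minimal sketch of that amortization point, using made-up fixed costs and volumes (the principle, not NVidia's actual numbers):

```python
# Hypothetical: identical one-time costs spread over different SKU volumes.
fixed_costs = 2_000_000     # design, validation, packaging, certification (invented figure)
units_16gb = 1_000_000      # assumed higher-volume SKU
units_8gb = 250_000         # assumed lower-volume SKU

print(f"fixed cost per 16GB card: ${fixed_costs / units_16gb:.2f}")   # $2.00
print(f"fixed cost per 8GB card:  ${fixed_costs / units_8gb:.2f}")    # $8.00
```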

But even if this wasn't true, it ignores reality, and a concept known as equilibrium price. NVidia has only a certain number of GPUs. If it sets a $10 differential between the 8GB and 16GB cards, then everyone wants the larger card -- the shelves run bare, while 8GB boxes gather dust. What then? Scalpers step in and start reselling the 16GB for a higher price, while desperate retailers begin selling the 8GB at below MSRP. And then guess what? You wind up with the exact same situation -- a large price gap between the two cards. In fact, economic theory predicts that, due to deadweight losses, that gap will be even larger than under what NVidia is doing now: pricing the models so that both sell through their production runs -- but only just.
 
Inconsequential, the 5090 has more L2 in it than all tiers of cache on the 9070 XT combined, in addition to the twice as wide bus and fast GDDR7 memory.

Radeon RX 6950 XT has more Infinity Cache (128 MB) than all tiers of cache on the RTX 5090 (117.76 MB).
Also, RX 9070 XT is a small chip, only 48% the size of the one in the RTX 5090. Completely different product tiers.
The first is made for gamers; the second is made for black-leather e-peen and social status, with super expensive yachts, private jets, Ferraris/Lamborghinis and all else that makes no sense in the real world.
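That ~48% figure lines up with the commonly cited die areas, assuming roughly 357 mm² for Navi 48 and roughly 750 mm² for GB202 (approximate numbers, not from this thread):

```python
navi48_mm2 = 357   # approximate Navi 48 (RX 9070 XT) die area
gb202_mm2 = 750    # approximate GB202 (RTX 5090) die area

print(f"Navi 48 is ~{navi48_mm2 / gb202_mm2:.0%} the area of GB202")   # ~48%
```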

It offers the best of both worlds.

It doesn't, because of its extremely high price. It's not worth it.
 
It doesn't, because of its extremely high price. It's not worth it.
It isn't worth the cost either, given all the issues the 50 series is having.
16GB is “more than enough”. I’ll have to remember this.
It is more than enough for the price point; the 9070 XT is 90-95% as fast as a 5070 Ti, with a $150 lower MSRP, and no melting power connector or awful crashing drivers.
But I know you want to take my post out of context, and that it is difficult for some Nvidia users to ever accept AMD being better value.
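Rough value math behind that claim, assuming launch MSRPs of $599 for the 9070 XT and $749 for the 5070 Ti and taking the 90-95% performance figure at face value:

```python
perf_ratio = 0.925     # midpoint of the 90-95% claim
msrp_9070xt = 599      # assumed launch MSRP
msrp_5070ti = 749      # assumed launch MSRP ($150 higher)

value_vs_5070ti = (perf_ratio / msrp_9070xt) / (1.0 / msrp_5070ti)
print(f"~{value_vs_5070ti:.0%} of the 5070 Ti's performance per dollar")   # ~116%
```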
and the margin on their consumer cards is less than half that: perhaps 25%
I'm going to have to press X doubt on that one. Nvidia likely has healthy margins, especially since they shifted the stack around starting with the RTX 30 series, and since die sizes keep getting smaller relative to the product tier while Nvidia only keeps charging more. Also, if consumer margins were low, they wouldn't be making consumer cards at all; no point when consumer sales only account for 6-7% of revenue.
situation in 2021, where artificially low prices led to product shortages, outages, and widespread scalping.
The situation which Nvidia contributed to significantly by selling products directly to mining farms while doing nothing to deter scalpers.
NVidia has only a certain amount of GPUs.
So, even more of a reason for the 8GB version to not be made at all, or the only 8GB card should be the 5060 non-Ti, with the 5060 Ti being made out of defective 5070 chips with a 192-bit bus width.
 
Radeon RX 6950 XT has more Infinity Cache (128 MB) than all tiers of cache on the RTX 5090 (117.76 MB).
Also, RX 9070 XT is a small chip, only 48% the size of the one in the RTX 5090. Completely different product tiers.
The first is made for gamers; the second is made for black-leather e-peen and social status, with super expensive yachts, private jets, Ferraris/Lamborghinis and all else that makes no sense in the real world.

It doesn't, because of its extremely high price. It's not worth it.

Yeah, it does. It's also slower than the 5070 (the least it had to do was be marginally faster than a 5-year-old flagship). Your point being? And who are you to decide whether something is worth it to someone or not, exactly?

We're literally talking about a purported professional variant of the 9070 XT. You said it yourself, there is no possible way it's keeping up. So, if faced with a 32 GB N48 design or a 5090 in the same price range, and you pick the AMD card, you either have a very specific requirement, or you're a complete fool. Thanks for proving my point.

It's hilarious how you mean to lecture me, telling me that the card doesn't need bandwidth because it has cache... all to result in a self-own for the ages. Stay classy.
 
After 26 pages the conclusion is pretty clear, 8gb of vram is enough to offer a good high end experience to people that actually have and use 8gb cards, but it's not enough for people that don't have an 8gb card and just watch reviews about them.
 
After 26 pages the conclusion is pretty clear, 8gb of vram is enough to offer a good high end experience to people that actually have and use 8gb cards, but it's not enough for people that don't have an 8gb card and just watch reviews about them.

Pretty much. The thread has gone full circle so many times over that it's devolved into yet another flame war. Sad.
 
Pretty much. The thread has gone full circle so many times over that it's devolved into yet another flame war. Sad.
The funny part is that even if we all agree that 8GB is not enough for high-end gaming anymore (which is factually wrong, but just for the sake of argument), I still haven't heard any compelling argument for why an 8GB 5060 Ti (or the 9060 XT AMD card) shouldn't exist. Since you are given a 16GB option, that should solve any complaints; you can just freaking buy the 16GB version. But nope, we HAVE to complain, for absolutely 0 reason. I get HUB doing it, he is generating traffic with his fake outrage, but why we end users are participating in this clown fiesta is beyond me.
 
Yeah, it does. It's also slower than the 5070 (the least it had to do was be marginally faster than a 5-year-old flagship). Your point being? And who are you to decide whether something is worth it to someone or not, exactly?
I think the point is the opposite - a bit similar to mine, in fact. Being a PC hardware enthusiast is not measured by whether one considers it worth spending the price of a used car on a GPU alone or not. We all love hardware here, but we all have different needs and levels of comfortable spending limits, which should be appreciated, not frowned upon. Spending should be a personal choice, not an expectation to gain access to some sort of elite club.
 
I think the point is the opposite - a bit similar to mine, in fact. Being a PC hardware enthusiast is not measured by whether one considers it worth spending the price of a used car on a GPU alone or not. We all love hardware here, but we all have different needs and levels of comfortable spending limits, which should be appreciated, not frowned upon. Spending should be a personal choice, not an expectation to gain access to some sort of elite club.

I think you are giving it a lot more credit than it was ever worth. Enthusiast by definition is someone who is enthusiastic - interested - engaged - concerned with any given thing. Otherwise that exchange would not really have happened. It's long since devolved into "my favorite brand rulezzz"; after all, an orange, a lemon and a tangerine may all be citrus, but they aren't the same fruit.
 
Enthusiast by definition is someone who is enthusiastic - interested - engaged - concerned with any given thing.
Exactly, thank you. :)
 
Back at the end of 2023, when I was still running two 1080p monitors and a poor 3070...
- In Forza Motorsport 8, high textures lead to frequent VRAM-related frame drops no matter how low I go with every other setting. Medium textures in FM8 are horrendous, for that matter. And no, the in-game benchmark doesn't show that.
- The Crew Motorfest frequently ran out of VRAM with everything set to high. And no, the in-game benchmark doesn't show that either.
These (and Forza Horizon 5) were the only heavy games I seriously played at the time. I didn't use Ultra settings or RT-related stuff in these games. The 3070 is one of

I had plans to go 4K at that time anyway, but even if I stayed at 1080p, it was obvious enough to me that the 3070 and 8GB VRAM cards in general don't have much life left at 1080p high settings. If only the 3070 had a bit more than 8GB, like 10GB...

While I can blame bad optimization in general, game devs are not going to dial back on VRAM usage, and more and more games will be heavily compromised by 8GB VRAM.
(FM8 did get a bit better later, if the patch notes didn't lie, but that was after I upgraded the displays and GPU. In comparison, FH5 can keep 45-60 fps on a borrowed 1660 Super with 6GB VRAM at 4K high. Yes, I have really tried it. Motorfest is, well, Ubisoft being Ubisoft.)

Now I do want that 3070 or a similar 8GB card back, but as a spare that is just fast enough to run games at a reasonable fps no matter the resolution and settings. Considering the 3070 cost me USD $550 equivalent as a new card in early 2021, personally I would pay at most USD $250 equivalent for that, be it a used or new card. Not that I'm actively looking for one.
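A minimal sketch of how to log VRAM usage over time with NVIDIA's NVML Python bindings (assumes the nvidia-ml-py package, i.e. import pynvml; run it alongside the game to catch the spikes that in-game benchmarks won't show):

```python
# Minimal VRAM logger: prints used/total memory for GPU 0 once per second.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        print(f"{mem.used / 2**20:7.0f} MiB used / {mem.total / 2**20:.0f} MiB total")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```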

After 26 pages the conclusion is pretty clear, 8gb of vram is enough to offer a good high end experience to people that actually have and use 8gb cards, but it's not enough for people that don't have an 8gb card and just watch reviews about them.
What a sad take. I'm sure there are still lots of people who are satisfied with whatever their 8GB cards can offer, and maybe you are one of them, but why do you think there is absolutely no one who runs out of 8GB of VRAM at 1080p? There wouldn't be this much noise if there weren't a significant enough portion of the userbase having trouble. I have experienced some of that in 2023.
Having takes like that makes me feel like I'm a guy getting screwed by frauds, yelling at clouds while no one is listening. Not that I'm actually screwed, but still.
 
What a sad take. I'm sure there are still lots of people who are satisfied with whatever their 8GB cards can offer, and maybe you are one of them, but why do you think there is absolutely no one who runs out of 8GB of VRAM at 1080p? There wouldn't be this much noise if there weren't a significant enough portion of the userbase having trouble. I have experienced some of that in 2023.
Having takes like that makes me feel like I'm a guy getting screwed by frauds, yelling at clouds while no one is listening. Not that I'm actually screwed, but still.
I never said there is absolutely no one that runs out of VRAM. I'm saying running out of VRAM doesn't make a card useless, it just means you have to lower settings. Just like you do when you run out of raster performance, RT performance, etc. There were cards launched at $1,200 (6950 XT) that ran out of RT performance on day one and had to drop settings or else they were getting single-digit fps. Nobody called that card DOA, they just accepted that they have to turn down some settings.
 
8gb of vram is enough to offer a good high end experience to people that actually have and use 8gb cards
No it isn’t, unless you count dropping to medium texture as high end experience.
Sure, there might be games with great “medium” textures, but that is a dodgy bet that nobody should count on. In FM8 it is not. Definitely not. (FM8 doesn't have an ultra texture setting, so at that time you were stuck with an admittedly great but heavy “high” setting or a horrendous, early-PS4-like “medium” setting.)

The 3070 failed to deliver a good experience at 1080p high (and I have to repeat, non-RT, non-ultra) in games back in late 2023 only because of the dreaded 8GB VRAM limitation. And there are quite a few people in other places who were failed by the 3070 (Ti).

And no, the 3070 was not DOA. It was a great card for what it was in 2021. But the game devs are not kind to the 3070.
 
No it isn’t, unless you count dropping to medium texture as high end experience.
Sure, there might be games with great “medium” textures, but that is a dodgy bet that nobody should count on. In FM8 it is not. Definitely not. (FM8 doesn't have an ultra texture setting, so at that time you were stuck with an admittedly great but heavy “high” setting or a horrendous, early-PS4-like “medium” setting.)

The 3070 failed to deliver a good experience at 1080p high (and I have to repeat, non-RT, non-ultra) in games back in late 2023 only because of the dreaded 8GB VRAM limitation. And there are quite a few people in other places who were failed by the 3070 (Ti).

And no, the 3070 was not DOA. It was a great card for what it was in 2021. But the game devs are not kind to the 3070.
Bingo. If the 3070 can only deliver PS4 textures due to 8GB of VRAM, then CLEARLY something is wrong with the game in question. How is it possible for textures to be getting worse while using a similar amount of VRAM?

Games looking worse while using similar resources is a game issue, especially when we are talking about a closed circuit racer. Those are considered super light games.
 
and the margin on their consumer cards is less than half that: perhaps 25%
I'm going to have to press X doubt on that one. Nvidia likely has healthy margins... if consumer margins were low they wouldn't be making consumer cards at all
Then you don't follow the market, nor have you read their financial statements. The last time I looked, NVidia was earning 9% of revenue from the gaming sector, but only 5% of profits, indicating margins about half those of its other sectors. Countless industry analysts have done analyses indicating NVidia's margins on their gaming chips are *much* slimmer than what they earn on Datacenter/AI. NVidia's skyrocketing margins over the last few years have come entirely from this latter sector.
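The "about half" follows straight from those shares; a quick sanity check using the ~55% company-wide net margin cited earlier in the thread (the 9%/5% split is as recalled above):

```python
overall_net_margin = 0.55      # company-wide figure cited earlier in the thread
gaming_revenue_share = 0.09    # ~9% of revenue, as recalled above
gaming_profit_share = 0.05     # ~5% of profit, as recalled above

gaming_margin = overall_net_margin * gaming_profit_share / gaming_revenue_share
other_margin = overall_net_margin * (1 - gaming_profit_share) / (1 - gaming_revenue_share)
print(f"implied gaming margin ~{gaming_margin:.0%}, other segments ~{other_margin:.0%}")
# roughly 31% vs. 57% -- about half
```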

And what sort of backwards logic is "if margins were low they wouldn't be making consumer cards at all"? 5% profit is better than nothing. I guarantee you NVidia has an entire army of financial analysts, all using their business calculus to determine the perfectly optimal allocation of their manufacturing capacity between their various market segments.

The situation which Nvidia contributed to significantly by selling products directly to mining farms while doing nothing to deter scalpers.
This is beyond inane. In a free-market economy, what do you suggest NVidia do to "deter scalpers"? Throughout most of the USSR's history, the Soviets sentenced black marketers to 10-year prison terms in Siberian gulags, and even *they* couldn't stop scalping. Should NVidia hire private investigators and submachine-gun-toting enforcers? I can imagine the monumental levels of outrage from you if NVidia attempted to sue some basement-dwelling gamer for reselling his new card at a profit.
 
I never said there is absolutely no one that runs out of VRAM. I'm saying running out of VRAM doesn't make a card useless, it just means you have to lower settings. Just like you do when you run out of raster performance, RT performance, etc. There were cards launched at $1,200 (6950 XT) that ran out of RT performance on day one and had to drop settings or else they were getting single-digit fps. Nobody called that card DOA, they just accepted that they have to turn down some settings.
Running out of VRAM isn't the same as having to turn off ray tracing; the 6950 XT is a 3-year-old card that is still as fast as a 5070. But the 5060 Ti is a card marketed for RT and high-performance gaming; for $400, with some models near $500, I don't consider it acceptable to have to turn down settings out of the box, and the card will only get slower or stop running games at all as an 8GB VRAM buffer starts to not be enough.
Bingo. If the 3070 can only deliver PS4 textures due to 8GB of VRAM, then CLEARLY something is wrong with the game in question. How is it possible for textures to be getting worse while using a similar amount of VRAM?

Games looking worse while using similar resources is a game issue, especially when we are talking about a closed circuit racer. Those are considered super light games.
Or maybe devs only have the time to optimize a game so much, or game optimization can only go so far with 8GB of VRAM before textures start to look muddy.
It's more that when consoles have a larger VRAM allocation, you know something is wrong with the market and with how badly Nvidia is deceiving its customers into thinking 8GB is still enough, let alone charging $400 for it.
Then you don't follow the market, nor have you read their financial statements. The last time I looked, NVidia was earning 9% of revenue from the gaming sector, but only 5% of profits, indicating margins about half those of its other sectors. Countless industry analysts have done analyses indicating NVidia's margins on their gaming chips are *much* slimmer than what they earn on Datacenter/AI. NVidia's skyrocketing margins over the last few years have come entirely from this latter sector.

And what sort of backwards logic is "if margins were low they wouldn't be making consumer cards at all"? 5% profit is better than nothing. I guarantee you NVidia has an entire army of financial analysts, all using their business calculus to determine the perfectly optimal allocation of their manufacturing capacity between their various market segments.
I'm not an Nvidia shareholder and I couldn't care less about their financial statements. Whether it's 5% or 9%, it is quite obvious the gaming sector has become the lowest priority next to their AI money-printing machine: a company built off the backs of gamers now puts barely any effort into the hardware, to the point of cards missing ROPs, melting power connectors, and very little QA on drivers.
This is beyond inane. In a free-market economy, what do you suggest NVidia do to "deter scalpers"? Throughout most of the USSR's history, the Soviets sentenced black marketers to 10-year prison terms in Siberian gulags, and even *they* couldn't stop scalping. Should NVidia hire private investigators and submachine-gun-toting enforcers? I can imagine the monumental levels of outrage from you if NVidia attempted to sue some basement-dwelling gamer for reselling his new card at a profit.
And this knee-jerk, reactionary, hyperbolic defense of Nvidia is what I expected; it's the same nonsense people used back in the mining hype days. Nvidia controls the supply to AIBs, and I'm saying they could have done something about it. Ah yes, and the "but but free market" excuse: Nvidia did get sued and fined for lying about where the GPUs were going, but it was only a slap on the wrist.
 
Literally any middle-class worker can afford a 5090; they're just not interested, since they aren't computer enthusiasts. Hence the 4090 being more popular than any dedicated Radeon GPU in Steam stats, despite still being a minority overall, because most people using Steam aren't PC enthusiasts, they're just gamers.


Sure, so you will continue to pretend you don't know what the term refers to. Ok. I guess AMD knows better.

The problem with that title is that there are plenty of enthusiasts buying the 9070 XT. The market has made it high end. If you think that average people are buying 5090s for play, you have no idea what has been going on in the economy. The 5090 will be bought by countries implementing their AI platforms, and also by companies. In fact, when you think about it, the 5090 is an enterprise-class card in terms of price. Then you use the 4090 as a reference, when a large percentage of 4090s were owners selling theirs to get 5000-series cards. Do you forget the article from about 8 months ago about Nvidia restricting stock? The most frustrating thing is people who present the 5060 and its $400 MSRP as the low end, when there are faster options from AMD with more VRAM.
 
After 26 pages the conclusion is pretty clear, 8gb of vram is enough to offer a good high end experience to people that actually have and use 8gb cards, but it's not enough for people that don't have an 8gb card and just watch reviews about them.
I wouldn't call it a "high end experience", but like I said before, 8 gigs are kinda enough for what I do with my PC.
 

Attachments

  • Indiana Jones And The Great Circle Screenshot 2025.05.01 - 16.59.48.57.png
  • Indiana Jones And The Great Circle Screenshot 2025.05.01 - 16.45.44.73.png