
Are the 8 GB cards worth it?

Would you use an 8GB graphics card, and why?

  • No, 8 GB is not enough in 2025

  • Yes, because I think I can play low settings at 1080p

  • I will explain in the comments section


Results are only viewable after voting.
I'm not taking the Steam hardware survey as gospel because I don't know what methodology they are using.
You don't know their methods, but you expect me to believe in it. We're stepping into religious territory here.

I'm saying that all experts in the field of statistics accept random sampling as one of the most accurate methods.
Why though? You just said yourself that you don't know what methodology the Steam survey uses. So why should we all accept it as the tried and tested method?

Which is self-evident anyway; simple statistics would tell you that it's much more unlikely that your random sample isn't accurate. You walk into a room with 1000 people, you sample 100 of them, and 90% have an Nvidia GPU. It's incredibly unlikely (like world-altering unlikely) that the actual market share in that room is 50-50 and you somehow managed to randomly select 9 out of 10 having Nvidia. It's more likely you'll win the lottery and get hit by lightning while on your way to collect the money.
If 9 out of 10 people have an Nvidia GPU in the room, and you select only 10 people out of the 100 for the survey, then there is a much higher chance of you selecting 10 Nvidia owners than a mixed sample.
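For what it's worth, the "world-altering unlikely" part of the quoted example is easy to check with a few lines of Python. This is only a rough sketch that treats the draw as a simple binomial, i.e. sampling with replacement; the exact hypergeometric figure for a 1000-person room is even smaller.

from math import comb

# Chance of seeing 90 or more Nvidia owners in a sample of 100
# if the room were actually split 50-50.
n, threshold, p = 100, 90, 0.5
prob = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(threshold, n + 1))
print(f"P(>= {threshold} Nvidia out of {n} | true 50-50 split) = {prob:.1e}")
# -> roughly 1.5e-17

That only settles the arithmetic, though; whether the Steam survey's sample is actually random is the separate question being argued in this thread.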
 
Guy is literally showing games side by side on the same card with different framebuffers... It's not valid because you don't approve of the settings or what?
No, it's invalid because those results do not match a number of other sources. Again...
Either he's being incompetent, which would not be a shocker, or he's deliberately skewing the results which would make him a fraud. I don't know which and don't care. HUB is a waste of time, can't be trusted and isn't worthy of even a single moment of consideration.
...thus.
 
Of course they are. That's what low settings are. TLOU 2, for example, runs on a 5500 XT and a 4-core CPU from 8 years ago at low settings. How is that possible if "devs don't spend time downgrading"? They always have and they always will; that's why graphics options exist in the game menu.
Low settings on a $400 GPU in 2025? Do you not realize how ridiculous that sounds?
And yes, you can keep running games on a 5500 XT. The point is that buying a new GPU today, which costs the same as a PS5 or Xbox, only to settle for the same performance as a GPU from 5 years ago is a stupid way to spend money. There has been very little progress in the xx60 tier of cards, and it's sad that people defend companies charging more for the same amount of VRAM.
There are 0 games right now that need at least 10-12 GB of VRAM.
There are; see the Hogwarts Legacy screenshot posted, where the 16GB card is using 10GB.
I'm not taking the Steam hardware survey as gospel because I don't know what methodology they are using. I'm saying that all experts in the field of statistics accept random sampling as one of the most accurate methods. Which is self-evident anyway; simple statistics would tell you that it's much more unlikely that your random sample isn't accurate. You walk into a room with 1000 people, you sample 100 of them, and 90% have an Nvidia GPU. It's incredibly unlikely (like world-altering unlikely) that the actual market share in that room is 50-50 and you somehow managed to randomly select 9 out of 10 having Nvidia. It's more likely you'll win the lottery and get hit by lightning while on your way to collect the money.
You said the "experts" have already decided the Steam hardware survey is accurate. But just who are these experts? You don't know what method Steam is using so the survey can't be considered accurate.
Random sampling when you walk into a room and pick 100 people isn't accurate, which is why everyone should be selected for the survey. Even if 90% of people are using an Nvidia GPU, I want to know what the methodology is and how many people were asked to participate in the survey.
Except that HUB's data doesn't match everyone else's, and given their history of bad numbers and benchmark results, they simply can't be trusted.

So can we stop talking about those fools?
By everyone you mean TPU, because you've been dismissive of every other review someone has posted showing the 8GB version performing worse than the 16GB card.
 
No, it's invalid because those results do not match a number of other sources. Again...

...thus.
He's for sure using settings that stress the framebuffer, but he says that in the video. His results are real though, and they're a real glimpse into the future.

The concept of having "barely enough RAM" shouldn't be difficult to grasp for anyone here. If you want to replace your card in 2 years, and really love asset pop-in, then get 8GB.

This is how it currently works for AAA and even some AA titles:
The game is developed for Xbox and PS5 with 16GB of unified memory, typically using UE5 with Nanite and fitting in as much as they can to make the world as large and immersive as possible.
Then they tweak asset streaming and downscaling to make it fit in 8GB so it can be ported down to PC and the Xbox Series S, where it runs like crap.

You have the occasional exception, but let me tell you, Stalker 2 running on my 8GB 4060 is a bit rough; doable, but it definitely struggles.
 
By everyone you mean TPU
No.
because you've been dismissive of every other review someone has posted showing the 8GB version performing worse than the 16GB card.
No, I have not. YOU have not been paying attention.

He's for sure using settings that stress the framebuffer, but he says that in the video. His results are real though, and they're a real glimpse into the future.

The concept of having "barely enough RAM" shouldn't be difficult to grasp for anyone here. If you want to replace your card in 2 years, and really love asset pop-in, then get 8GB.
As has been said many times in this and a few other threads where this very discussion has happened, no one buying these cards is going to run games on max/ultra settings. It's a strawman argument at best. If a game is hitting the 8GB limit, turn some settings down and Bob's your uncle. I have encountered this problem a couple of times with my 3080 running on my living room PC and 4K TV. Turn a few settings down and problem solved. No big deal. 10GB is still enough and so is 8GB.
 
As has been said many times in this and a few other threads where this very discussion has happened, no one buying these cards is going to run games on max/ultra settings. It's a strawman argument at best. If a game is hitting the 8GB limit, turn some settings down and Bob's your uncle. I have encountered this problem a couple of times with my 3080 running on my living room PC and 4K TV. Turn a few settings down and problem solved. No big deal.
You bought your 3080 almost 4 years ago (still an awesome card). Only very recently have games come out primarily targeting more VRAM, so games are just now starting to downscale textures on 10GB (no HD texture pack in FC6, for instance).

Now, combine that with the fact that it takes maybe $20-$30 in manufacturing cost to go from 8GB to 16GB using GDDR7 chips, and come back to the title of this thread/poll: is 8GB enough in 2025, for a new card?

Are you going to roll over and die with an 8GB card? Probably not.

But that building that just popped in 20 feet in front of you in Microsoft Flight Simulator might be a problem. Look at the VRAM usage in Dragon's Dogma 2, Stalker 2, Indiana Jones and the Great Circle, or Spider-Man 2, and you can see they're going to be selective about what gets streamed in or downscaled as you play.


Also, with frame gen and upscaling, even the 5060 is powerful enough to take advantage of a 16GB buffer, so it makes sense to go for more.

A good analogue is the 1060 3GB vs 6GB. The 1060 6GB ended up being the number 1 card on Steam for 3 YEARS, while the 3GB barely had enough VRAM when it came out, and 2 years later barely anyone was using it.
 
Random sampling is inaccurate just as vaccines cause autism, the moon landings were faked, and the earth is a flat disk, carried through space by a gigantic tortoise. Have they stopped teaching statistics entirely in public schools?

The problem with the Steam survey is that it is not fully random; it samples only those who opt in among Steam's own user base. This causes a significant degree of sampling bias if you're attempting to measure "the PC community at large". If you're measuring the enthusiast gamer market, most of which (it is my understanding) uses Steam, it's significantly more accurate. Even there, though, the opt-in requirement causes a small but perceptible bias towards higher-end systems.

You...really shouldn't try to explain statistics. It's much deeper than you give it credit for, and your statements are patently wrong on most levels, while only hitting the very frailest definition of right.

Let's take the bad argument of "random sampling is wrong" and dig in. Let's do it with some basic numbers, acknowledging that they are much too small but tolerable, so I don't have to whip out huge numbers. You have a sample size of 100 respondents. The actual number of potential people is 1000. You extrapolate data from the 100; is it accurate? My hint here is that the answer is not no. There is a question of whether the sample size accurately represents the whole, and a question of whether the sample taken is truly random.


The answer here is that the sample of 100 isn't an exact representation of the 1000. Assuming you are truly choosing at random, there's no problem in stating THE MARGIN OF ERROR, which quantifies how much sampling error goes in. Funny that you'd use the Steam survey... because their data is expressly shown as percentages instead of quantities... almost as if any good statistician would immediately dismiss it as incapable of giving concrete numbers, because those percentages could represent 1,000 people or 1,000,000... but most people do miss that LOW BAR.

You then bring up a bias towards higher-end systems, as much as can be expected from a voluntary survey. Quantify it. Oh right, you cannot, because you're pulling that out of your backside supported only by percentage data. Cool. So... your data is garbage, we cannot calculate a margin of error, what do we do? Right, in your magical situation we just throw the data out.



Let's answer your premises in order then, for the TL;DR crowd:
1) Random sampling isn't random sampling when all you have is percentage data and no known sample size, leaving no way to quantify the margin of error. You failed math; it didn't fail you.
2) Autism was reclassified as autism spectrum disorder at the same time the anti-vaccination movement came about. Again, when your qualifiers open up wide in conjunction with a non-scientific movement, it's really easy to "prove" an uptick in AUTISM SPECTRUM DISORDER coinciding with increased vaccination. Correlation is not the same as causation, and humans are irrationally protective of their offspring because of our evolutionary path. Trying to explain this to a mother trying to shield her offspring is silly, because a 1% issue rate (and I am way overestimating that for vaccinations) framed as possible overrides the 99% chance of no harm, because if her child is in that 1% she'll scream at everyone, just like the anti-vax crowd did.
3) No comment.
4) This... is not Discworld. It's funny that you want to make that claim... but more than 2335 years later people still believe. My counter to your argument is that today there are benchmarks that demonstrate 8GB isn't enough VRAM for applications... so as much as you want to tar and feather your opposition for a view you disagree with, you might also want to check whether you are standing upon facts, or upon assumptions based on historic truths.


Statistics sucks. Please stop pretending you understand it, because I guarantee somebody smarter than you already had the same idea and either has a postulate or a mathematical calculation to address your simplification. Confidence level and margin of error make statistics as fundamentally solid as any other discipline, assuming you express these values and do the basic calculations. Just because you're ignorant of the solution doesn't mean there isn't one. This is why statisticians exist... and why they are some of the people who could greatly benefit from Nvidia not cheaping out on VRAM.

Edit:
I see your point; I just see no purpose in blaming either the card or the game. If you want to play TES4 Remake, you'll have to make do with whatever settings give you performance you're comfortable with.


The Steam survey is flawed for being random. It is not an accurate measure of market share by any means.

Sure, that's why it's so consistent month after month. Random sampling is standard for industries and provides accurate measures in every other field, but in PC hardware it's simply not reliable...

And other copes.

Neither of you guys seems to get the core issue. Pay attention to the numbers. What don't you see?

.
...
.....

That's right, sample size. How many did Valve query? How many pieces of hardware were represented? You've got no answers, because there are no numbers, only percentages.
If you could say that 4.6% of people ran 4060 mobile GPUs, that this represented 1,000,000 respondents, and that you extrapolated the population size at about 6,000,000, it'd be a pretty dang good representation of the entire group.
If instead that 4.6% represents only 1,000 respondents out of a population of 6,000,000, the margin of error would invalidate the conclusions. So... the problem is that Valve intentionally obscures the data and you are extrapolating from bad inputs. If I were trying to be funny I'd joke that this is the exact issue we see with interpolated frames, but I feel you might not have a big enough buffer to store that data.
ba-dum-tish (drum roll noise)
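To put rough numbers on that, here's a minimal sketch in Python. It uses the standard normal-approximation margin of error for a sample proportion at roughly 95% confidence; that formula and the z = 1.96 value are my addition, while the 4.6% share and the 1,000 vs 1,000,000 respondent counts are just the figures used above.

import math

def margin_of_error(p, n, z=1.96):
    # Normal-approximation margin of error for a sample proportion
    # at roughly 95% confidence (z = 1.96).
    return z * math.sqrt(p * (1 - p) / n)

share = 0.046  # the 4.6% figure from the example above
for n in (1_000, 1_000_000):
    print(f"n = {n:>9,}: 4.6% +/- {margin_of_error(share, n) * 100:.2f} percentage points")
# n =     1,000: 4.6% +/- 1.30 percentage points
# n = 1,000,000: 4.6% +/- 0.04 percentage points

With a million respondents the percentage is effectively exact; with a thousand, the uncertainty is a sizeable chunk of the 4.6% being quoted, which is exactly why percentages without a published sample size don't tell you much.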

-Edit end-
 
Mhhm. My Gigabyte RTX 3060 OC Gaming (the one with 3 fans and a big air cooler) goes up to 2100 MHz. Stock is 1700 MHz; that's 19 percent more than stock. With that much more raw power it should be slightly faster than a 4060, and it has 4GB more VRAM. People told me that I should buy a 4060 and avoid Nvidia Ampere in 2025, but in Indiana Jones the 3060 is incredibly fast even WITHOUT the overclocking. Not bad that they still sell this card for under 300 euros brand new. It's okay for me. I'm happy, and I know that's something strange. But they still produce the 3060 with 12GB in April 2025.
 
I have also seen people say that the 9070XT is faster than the 7900XT.
Are you for real? It's not even particularly close. The 9070XT is ~10% faster than the 7900XT on average.

If you meant the 7900XTX, that's on you.

There are 0 games right now that need at least 10-12 GB of VRAM.
If by that you mean 'it's playable on 1080p medium settings', sure. If you're aiming at 1440p or high settings, much less both, you'll want more than 8GB of VRAM for more than 0 games.
 
Quite obviously you care even less about the accuracy of your own statements. NVidia's margins on gaming cards are much slimmer than on their other offerings; this is simple fact.

Nvidia doesn't provide margin or profit data on just its consumer cards. If you have a source for your statement here, please provide it.

That's really your entire gripe, isn't it? You're bitter and feeling jilted? NVidia shows exactly as much loyalty to you as you, a presumably rational consumer, do them. You purchase the best product for your needs at the best price, and they attempt to maximize profits by producing products people will purchase. You may feign indignation over that self-interested relationship, but, to paraphrase Winston Churchill, it's the worst possible system, except for everything else we've ever tried.

We know how the market mechanically works at a broad scale; that doesn't change the fact that customers are getting royally screwed and that these companies are enabling and profiting from it. As you pointed out, that's the way the system works, but that doesn't make it right.

For example, say someone buys out all the drinkable water in a town right before a disaster, and when they resell it at a 1000x markup their response is to explain how supply and demand works. That doesn't change the fact that they are the POS screwing people over, or mitigate any of the town's rightful gripes.

At the end of the day, pointing at the system as the "culprit" here ignores the very real cause of the gripes being brought up. The system can be good, and we should still not excuse bad actors.

Then name it. What could they have done to control scalping? Be specific.

Nvidia already demonstrated a method with the priority queue system (and EVGA with its queue system in the past as well). Nvidia could expand this system further by allowing retailers to tap into it.

Mind you, I don't think it's up to customers to fix corporations' problems for them. It would be a different matter if we saw retailers and Nvidia trying different things out, but it took 3 generations of GPUs with massive shortage issues for Nvidia to even start trying one thing. Excluding EVGA, of course, the only AIB that actually cared.

The problem with the Steam survey is that it is not fully random; it samples only those who opt in among Steam's own user base. This causes a significant degree of sampling bias if you're attempting to measure "the PC community at large". If you're measuring the enthusiast gamer market, most of which (it is my understanding) uses Steam, it's significantly more accurate. Even there, though, the opt-in requirement causes a small but perceptible bias towards higher-end systems.

That, and its issues with re-sampling multi-user machines, net cafes for example. The survey sometimes has 10% swings in language share among Steam users in a single month, which makes zero sense.
 
You don't know their methods, but you expect me to believe in it. We're stepping into religious territory here.
Me? I asked you to believe the Steam charts? When did that happen?

Why though? You just said yourself that you don't know what methodology the Steam survey uses. So why should we all accept it as the true and tested method?
But I never said you should trust them. I said I don't know what methodology they are using. But if they are using random sampling, that's accurate.

If 9 out of 10 people have an Nvidia GPU in the room, and you select only 10 people out of the 100 for the survey, then there is a much higher chance of you selecting 10 Nvidia owners than a mixed sample.
Exactly, that's pretty accurate.

There are; see the Hogwarts Legacy screenshot posted, where the 16GB card is using 10GB.
You can't be serious... Hogwarts uses 21 GB on my 4090. Does that mean it NEEDS that much to play the game?


He's for sure using settings that stress the framebuffer, but he says that in the video. His results are real though, and they're a real glimpse into the future.

The concept of having "barely enough RAM" shouldn't be difficult to grasp for anyone here. If you want to replace your card in 2 years, and really love asset pop-in, then get 8GB.
And how is his methodology any different from me just grabbing a 4090, using settings that would stress its performance, and then concluding that the card is crap and can't play modern games? Because off the top of my head I can come up with at least 5-6 games where the 4090 drops to single digits or 20 fps. How can the conclusion of my little experiment be "just buy a 5090" instead of "just lower the damn settings"?
 
Me? I asked you to believe the Steam charts? When did that happen?


But I never said you should trust them. I said I don't know what methodology they are using. But if they are using random sampling, that's accurate.


Exactly, that's pretty accurate.


You can't be serious... Hogwarts uses 21 GB on my 4090. Does that mean it NEEDS that much to play the game?



And how is his methodology any different from me just grabbing a 4090, using settings that would stress its performance, and then concluding that the card is crap and can't play modern games? Because off the top of my head I can come up with at least 5-6 games where the 4090 drops to single digits or 20 fps. How can the conclusion of my little experiment be "just buy a 5090" instead of "just lower the damn settings"?
Because even at the lowest settings these new games are already allocating more than 8GB. The difference between a 4090 48GB and a 4090 24GB is going to be 0 in all these games regardless of settings. Same for a 5060 Ti 16GB and a 5060 Ti 32GB.

That's how RAM works: there's only a difference if there isn't enough of it.

The 8GB 5060 Ti = 1060 3GB. Same products within their respective times.
 
The 8GB 5060 Ti = 1060 3GB. Same products within their respective times.
Or the HD 4870 512MB.
As I recall, that card initially benchmarked nigh identically to its 1GB counterpart, but started having problems with running out of VRAM a year or two down the line as new games became more demanding.

In that regard, the 5060Ti is in an even worse spot, because the card already demonstrably has problems with running out of VRAM at settings that'd otherwise be entirely playable at 60fps.
 
Quite obviously you care even less about the accuracy of your own statements. NVidia's margins on gaming cards are much slimmer than on their other offerings; this is simple fact.


The 5070 Ti has no issues with melting power connectors. As for "awful crashing drivers", over the last 10 years it's a safe bet to say NVidia's drivers have been considerably more stable than their competitors'.


That's really your entire gripe, isn't it? You're bitter and feeling jilted? NVidia shows exactly as much loyalty to you as you, a presumably rational consumer, do them. You purchase the best product for your needs at the best price, and they attempt to maximize profits by producing products people will purchase. You may feign indignation over that self-interested relationship, but, to paraphrase Winston Churchill, it's the worst possible system, except for everything else we've ever tried.


Then name it. What could they have done to control scalping? Be specific.

2023 report.

Nvidia investor reports

0.4 billion in stock buybacks
10 billion in investor dividends
27 billion in revenue
56.9% gross margin

So, let's do some simple math. 27 - 10 - 0.4 = 16.6 billion left to cover operating costs and the like. That 10.4 billion was pure profit returned to shareholders, so pleading poverty is complete BS. You would take the reinvestment into research out of gross margin... but assuming a net-zero change in liquid capital:

27 * 0.569 = 15.363, and 15.363 - 10.4 = 4.963. Yep, Nvidia returned 10.4 billion through stock buybacks and dividends in 2023 and still had just shy of 5 billion in profit to work with. This is profit, as the margin associated with overhead was only 43.1%.
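Spelled out as a quick sanity check, here is a Python sketch that just re-runs the quoted figures (all values in billions of USD, taken from the post above; note it works off gross margin, so operating costs like R&D are not netted out here):

revenue      = 27.0   # 2023 revenue from the quoted report
gross_margin = 0.569  # 56.9% gross margin
dividends    = 10.0   # investor dividends
buybacks     = 0.4    # stock buybacks

gross_profit = revenue * gross_margin   # 15.363
returned     = dividends + buybacks     # 10.4 returned to shareholders
remaining    = gross_profit - returned  # 4.963 left over

print(f"gross profit {gross_profit:.3f}B, returned {returned:.1f}B, remaining {remaining:.3f}B")
# -> gross profit 15.363B, returned 10.4B, remaining 4.963B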


Do you maybe get it now, or are you going to be intentionally obtuse and pretend that a publicly traded company doesn't have its performance reported to the public? If there were some magical razor-thin margin, it sure as hell isn't at a company that pays out 10 billion in dividends against a revenue of 27 billion, and pretending otherwise is willfully stupid.


So you can benchmark this against why I've stated multiple times that an 8GB 5060 Ti is garbage: Nvidia's gaming revenue was 9.1 billion in 2023, down 27% year over year. Despite this, their data center revenue was up to 15 billion, a 41% YoY increase. Yeah, it seems like instead of growing their consumer gaming segment they've decided to give it the old heave-ho. Not something I think is wrong in itself, but it is patently indicative of Nvidia giving up on the 5060 segment and letting it be the tier of this generation that ages the least gracefully, given the astronomical pricing.




Let me also suggest that putting "control of scalping" into Nvidia's hands, as you asked, is simple. Two words: car dealership. Finite quantities go to trusted vendors. Vendors vet customers. Vendors get a middleman price cut, but are incentivized to get customers to sign legal documentation that says they cannot resell the cards for 6 months. If someone does, the dealer and the customer are blacklisted and not allowed to buy in early anymore. See the obvious example with the Ford GT and John Cena, and Car and Driver's look into the GT resale lawsuit.
I'd laugh at this point, because I'm sure you want to poke holes in this, but you asked for one example and I provided a real one. This is the point where you backpedal and tell me why the example is wrong, instead of thinking for about 20 more seconds and considering that the win here is not to plan Nvidia's launch for them, but to not offer to polish their wedding tackle for them when they release a bad product at the price point they are demanding.
 
8GB is not worth it; if we stop buying crap like this, companies will stop making it.
It's as simple as the rumors of new Nvidia models with more VRAM already being expected.
Everyone's needs are different, and for some people 8GB will be enough, but that's not the point.
 
if we stop buying crap like this, companies will stop making it.
Remember the 8800 GTS 320/640, or the GTX 260 192/216, and all the other newer ones... I don't think they will stop anytime soon :D
 
It's almost worth it ;)
 
I saved a few bucks and went with the GTS 320. I sold that thing within weeks... I think I traded it, and an EVGA 680i, for an 8800 GTX and some cash.
 
The optional Steam survey is very accurate, reliable, and of course not skewed by gaming cafe PCs :D

February 2025

[attached screenshot: February 2025 Steam survey data]

what?

I had imagined the USA would be the dominant market for gaming GPUs? The Chinese accounting for 50%... that's a bit of an odd one. Not that I consider Steam surveys to be an accurate reflection of the realities on the ground, but sourcing 50% of the data from big ole China alone... dunno, perhaps 'made in China' replica reporting?
 
I see some paid tech outlets writing that 8GiB VRAM graphics cards should be avoided. I would extend that even to 12GiB of VRAM.
8GiB cards are nice for retro gaming. Nothing wrong with them for such stuff, or for low-quality games, e.g. FPS shooters.

Let's see if there have been any new insights in this topic from page 15 to page 31 now. I highly doubt it, but maybe I'm wrong.

On my laptop I was asked to take the Steam survey for the first time in around 25 years. I was never asked on my previous Steam installations. That includes several fresh Steam installations with a login and over 100 games, on GNU Gentoo Linux and on various Windows installations, and it also includes my desktop computers. Those Steam statistics are flawed: Windows users only, and only selected Windows users at that.

I saw that Steam statistics survey for the first time. The first thing it asks: do you want to be added to the statistics? Stop with the Steam statistics nonsense. The data is not valid.

Edit: I understood the topic as whether 8GiB graphics cards are worth a purchase regardless of the price. Some say that certain regions want graphics cards with 8GiB of VRAM because of the sale price. I think that is not a valid argument for the topic question about the usefulness of 8GiB graphics cards. Yes, 8GiB graphics cards are worth the money at 30€ with four DP 2.1 connectors in a single-slot design; for a monitor expansion card, 8GiB of VRAM is plenty, why not. At 31€ and up, I expect more. When a consumer buys new old stock like an Nvidia 2060 or something, they know what they are doing. New products should not be e-waste in the first place for several years.
 
I saw that Steam statistics survey for the first time. The first thing it asks: do you want to be added to the statistics? Stop with the Steam statistics nonsense. The data is not valid.
The Steam hardware survey is only a valid source of information if it reinforces my biased opinions, hehe
 
8GB is not worth it; if we stop buying crap like this, companies will stop making it.
It's as simple as the rumors of new Nvidia models with more VRAM already being expected.
Everyone's needs are different, and for some people 8GB will be enough, but that's not the point.

Nah, 8GB at the right price is still perfectly adequate and fully capable for a good portion of today's gaming community. This is indisputable. There are plenty of good options available, and for those unlikely to exceed 8GB, previous-generation and more affordable graphics cards still offer more than enough performance, in fact blazing fast! If anyone is still refuting this (not directed at AVARAT), as a brother from another mother, I urge you to plug yourself into the silicon matrix and update your firmware.

If you're referring to the 50 series, then there's no two ways about it: as certain as the stars in the heavens, and with thunderously loud clarity beyond dispute, the "current gen" upper 60-class 8GB card is absolutely not worth its asking price. Wouldn't touch it, not even with a 10-foot telescopic pole with thick rubber gloves on. It's all down to performance segmentation and pricing... at this tier, the upper-ceiling Ti moniker and its price point should deliver uncompromised performance for lower-resolution gaming and hold its own at 1440p with 60 FPS+ on high or max settings, including 'most' newer AAA titles. 8GB betrays that space. That's the whole damn truth and nothing but the truth, delivered faster than your GPU can render O-T-H-E-R-W-I-S-E.

There is no room for debate, just misplaced sensibilities trying to pass as logic. Now, if people are super delighted and graciously receptive to buying a brand-new expensive 'fast' car, capable of 120 mph (looks great in the review charts), at the cost of dynamically murked acrylic windows, props to them! Maybe I'm a special type of gamer who likes my 'glass' windows spotless, so every sunrise, mountain, green and open road feels alive in all its glory. YUM YUM. I don't mind sacrifices which are either subtle and do little to compromise visual fidelity, or necessary concessions for odd-ball heavy hitters, but full-blown trade-offs at $400+ for low-res or a peg-up gaming in 2025? That's not consumer-centric logic, that's a new benchmark record for BS.

LOL, no offense to anyone, but I can't help picturing Don LaFontaine's voice over the radio: "8GB Avengers: Champions of Overpriced Mediocrity, coming to a screen near you, 2025. Oh, and viewer discretion advised: pretty pretty candy on top please, lower your expectations going in."

Credit where it's due, the 16GB variant was a good call for the 60 class. Price? Not so much. But then again, pricing is a problem plaguing GPUs across the board.
 