
5060 Ti 8GB DOA

Don't understand an 8GB GPU in 2025.
 
Don't understand an 8GB GPU in 2025.
The problem is that games require more VRAM while looking worse visually, so no matter what you do, it's never enough. But it's nice not having to move sliders just to get a game to barely work.
 
Hey guys!

This is for everyone defending 8 GB cards in 2025. The cheapest model HU (Hardware Unboxed) was able to get cost 400€, and it's not even able to play some games properly at 1080p.
Here's the video:

And here are some screenshots from the video:
View attachments 396176, 396177, 396178, 396179

Can't wait for @W1zzard to get his hands on one :laugh:
Even if it were HBM4, 8GB might not be enough these days.

Fury X fell by the wayside pretty quickly due to its limited VRAM, and 8GB Vega 64 is really only useful for retro builds.
Only the 16GB VIIs have any 'teeth' today, and 90+% of the non-WC'd Non-Pro cards self-terminated from thermal warpage.

As we sit today, 8GB cards are the 4GB cards of yesteryear, and 12GB has become the bare minimum.
For example: my housemate manages to run out of VRAM on his 16GB 9070 XT in BeamNG.drive (with mods) at 1440p...

Maybe things would've been different if DirectStorage had been better (and more broadly) implemented.

Don't understand an 8GB GPU in 2025.
Entry-Level 3D Acceleration.
These will probably be *the* card seen in most of the 'gaming' Dell Inspirons and HP Pavilions (assuming they even still use those brand lines; it's been a while for me).
 
Don't understand an 8GB GPU in 2025.
I can't understand $540 either, but there it is.

Past cards: 12GB, but the GPU was too weak anyway.

Today's cards: strong GPU, but they pinch the GB baggy of the capacity nugs.
 
Fury X fell by the wayside pretty quickly due to its limited VRAM, and 8GB Vega 64 is really only useful for retro builds.
Only the 16GB VIIs have any 'teeth' today, and 90+% of the non-WC'd Non-Pro cards self-terminated from thermal warpage.
I understand your point, but the cards you're mentioning didn't have such problems running games when they came out. The only way of "breaking" the Fury's performance was using unreasonable settings for the time (4K with everything maxed out), and that's not the case here. In most of the games tested, we see 1440p with DLSS Quality, which is roughly native 1080p plus single-digit-percent more VRAM usage because of upscaling (see the quick math below), and the card is still struggling.

The Vegas are able to pull their weight even today (albeit while using a lot of power). The 56 was a direct competitor to the 2060, and the 2060 died off way quicker because it "only" had 6GB of VRAM vs the Vega's 8. I still remember my friend complaining about "his world not loading" in Forza 5 when using high textures. I was able to crank that up to max on my Vega and still not encounter problems.

Not being able to play a game at native 1080p at more than medium settings, for a price of 400€, just because Nvidia decided to shave off half of the card's VRAM, is just outrageous.

In short, this should have been a GTX 5050 (because it's useless in RT) priced at no more than 250€ (and that's a stretch).

Let's hope AMD doesn't make the same mistake, and that they release the 8GB 9060 as a non-XT part, priced accordingly.
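Since the DLSS Quality point above is just arithmetic, here's a quick back-of-the-envelope sketch you can check yourself. The ~2/3-per-axis scale factor for DLSS Quality is an assumption (the commonly cited figure; exact ratios can vary per game):

```python
# Rough render-resolution math for the DLSS Quality claim above.
# Assumption: DLSS Quality renders at ~2/3 of the output resolution per axis.

def render_res(out_w: int, out_h: int, scale: float = 2 / 3):
    return int(out_w * scale), int(out_h * scale)

w, h = render_res(2560, 1440)
print(f"1440p + DLSS Quality: ~{w}x{h} ({w * h / 1e6:.2f} MP)")
print(f"Native 1080p:          1920x1080 ({1920 * 1080 / 1e6:.2f} MP)")
# ~1706x960 (1.64 MP) vs 2.07 MP: a similar pixel load to native 1080p,
# with the upscaler itself adding a small amount of VRAM on top.
```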
 
Part of what sucks with modern titles is that you can be fine 90% of the time, but then you get to scenes where it turns to garbage. I recall Forbidden West doing that: most of the original game ran fine, but the expansion world got laggy as hell in places.
 
Buying this card would be madness; it's barely any cheaper than the 16 gig variant. I did link some analysis done on the VRAM problems, but it got no traction and I can't be bothered to dig out the link again.

Depends what games and settings he uses. His review of the 4060 Ti 16GB concluded it was no better than the 8GB in most games. If he uses the same games and settings, I expect he'll reach a similar conclusion about the 5060 Ti. We'll see.

I hope he also pays attention to texture quality, because that's another thing that can escape attention if you just look at the raw numbers. The framerate might be the same on the 8GB card, but if the textures haven't loaded and look all blurry, it's not a fair comparison.

Another thing I saw, I think in a JayzTwoCents video: sometimes the framerate counter will say e.g. 30, but the real gameplay has turned into a single-figure slideshow. Sometimes the frame-counting software doesn't seem to realise it's getting it wrong.
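To illustrate how an averaged counter can get it wrong, here's a minimal sketch with made-up frame times (the ~25 ms and ~300 ms numbers are purely illustrative): a handful of long stalls barely moves the average, but collapses the 1% lows that reflect what you actually feel.

```python
# Hypothetical frame times (ms): 97 smooth frames plus 3 long stalls,
# e.g. from texture swapping when VRAM is saturated.
frames_ms = [25.0] * 97 + [300.0] * 3

def average_fps(frame_times):
    """What a simple overlay reports: frames divided by elapsed time."""
    return 1000.0 * len(frame_times) / sum(frame_times)

def one_percent_low_fps(frame_times):
    """FPS implied by the slowest 1% of frames."""
    worst = sorted(frame_times, reverse=True)
    worst = worst[: max(1, len(worst) // 100)]
    return 1000.0 / (sum(worst) / len(worst))

print(f"average: {average_fps(frames_ms):.1f} FPS")          # ~30.1 FPS
print(f"1% low:  {one_percent_low_fps(frames_ms):.1f} FPS")  # ~3.3 FPS
```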

Yes, this happened to DF as well when they were reviewing the FF7 remake.

Reviewers sadly need to do their tests interactively; don't just run it in a room somewhere and look at graphs. They have to look at texture issues, texture quality, and actual frame stalls live.

J2C recorded the FPS counter anomaly live, so there should be no debate now about that being a real issue.

The 5060 Ti 8GB needs to be treated as the type of card it is: a low-end GPU ideally meant for 1080p gaming. This means you keep settings at realistic levels it can handle, so it's not swapping textures in a saturated-VRAM situation that tanks the performance. The screens you posted do not do this; they're simply trying to make the card work hard at something it clearly cannot support the way the 16GB variant can.

Browsing through the HWU video, it would have been better to post images of the medium settings tested, to show that the card is indeed capable of what it's designed for: low-end gaming by today's standards. Sadly, most of the games tested were run at high/max settings, with and without RT, which showed the card unable to perform; the exact way we'd expect it to fail with only 8GB.

View attachments 396185, 396184, 396183, 396182

These screen grabs show the card is capable of playing these games fairly well when set to acceptable settings and resolution. Anyone who gets this card, or even the 16GB version, and expects to max out settings and have everything playable at quality performance levels is sadly mistaken and needs to rethink what they're doing.

I'm not supporting this 8GB card or saying it's a good value (it's clearly not, and I'm not saying the 16GB version is a good value either, because I don't feel that it is). What I am saying is that the card needs to be treated as exactly what it is: a low-end, overpriced gaming card not meant for maxing out settings above 1080p... and even then that's a bit of a stretch in some games.
Rendering resolution is independent of texture resolution. If a game can only do PS3-era textures with massive weird glitches, going down to 1080p won't save it.
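A rough sketch of why that is, with assumed but plausible numbers (BC7-style block compression at ~1 byte per texel, and a full mip chain adding about a third):

```python
# Texture VRAM cost depends on texture resolution, not render resolution.
# Assumptions: ~1 byte/texel (BC7-style compression), full mip chain (+1/3).

def texture_mib(size_px: int, bytes_per_texel: float = 1.0) -> float:
    base = size_px * size_px * bytes_per_texel
    return base * (4 / 3) / 2**20  # mip chain adds ~33%

for size in (1024, 2048, 4096):
    print(f"{size}x{size} texture: ~{texture_mib(size):.1f} MiB")
# 1024 -> ~1.3 MiB, 2048 -> ~5.3 MiB, 4096 -> ~21.3 MiB.
# These costs are identical whether you output at 4K or 1080p, which is
# why lowering render resolution doesn't free the VRAM that high-res
# texture packs consume.
```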

The non-Ti 5060 will only be available in 8 GB, iirc. However, I don't agree with Hardware Unboxed's testing on this one. They can't put "very high, ultra textures, 4K resolution" on an 8 GB card, and they know that very well. In a 1080p scenario, 8 GB will still be acceptable for the time being, and this is what the RTX 5060 series was designed for, including the Ti. Sure, the wisdom of getting the 8 GB model when they're priced within 20% of each other is questionable, but buying a 5060 Ti for 4K gaming is equally questionable - unless you're the sort that believes the Nintendo Switch 2 will truly do 4K with an embedded GPU that doesn't match up to an MX 570.

Nobody will complain if the RX 9060 comes in 8 GB and the 9060 XT comes in 8 and 16 GB flavors - there'll be a 9070 GRE to slot in the middle too, just where the RTX 5070 currently resides - as long as the price is good on all three of these. All this bashing, and in reality all these things need is to have their prices slashed, quite possibly in half. But that's not happening, because the semiconductor industry is pressed to the breaking point. GPUs will only get pricier, and as we've seen with the 9070 XT and the upper RTX range, MSRP is a fantasy; very few people got, or will get, an MSRP card at all.
1080p FF7 Remake will have issues on 8 GB, even with textures set to low. How do I know? I had issues on a 10 GB card.
 
Really have not seen people "defending 8GB cards" outside of entry-level cards. Virtually everyone thinks 8GB on a 5060 Ti class card is silly now, but the lack of VRAM is not the only issue on those cards; the narrow memory bus doesn't help either.

Pretty much all the other sites with a robust game suite came to the same conclusion as W1zzard for the 4060 Ti 8/16 GB.
You should be looking at the comments on the 5060 Ti reviews. There are absolutely members here that still defend 8GB cards, including _roman_, who in this thread is trying to find justification for such a pointless card existing, because we just cannot admit that 8GB GPUs belong in the dustbin of history.
The 5060 Ti 8GB needs to be treated as the type of card it is: a low-end GPU ideally meant for 1080p gaming. This means you keep settings at realistic levels it can handle, so it's not swapping textures in a saturated-VRAM situation that tanks the performance. [...]
The 5060 Ti 8GB IS being treated for what it is: a gimped 5060 Ti that cannot use its full potential because of its silly low VRAM capacity.

It should not exist. Cut it in half and sell it as a $120 5050 instead.
 
1080p FF7 Remake will have issues on 8 GB, even with textures set to low. How do I know? I had issues on a 10 GB card.

Not trying to defend it (not at this price, at least), but that's where I'll have to argue: if a game can't run on 8 GB at 1080p at low settings, then I reckon the problem isn't the hardware. The Xbox Series S has less memory than that to be shared between program and graphics memory (10 GB total, of which its OS and hypervisor take about 3-4 GB), and games have to run on that hardware. In this scenario, 8 GB of dedicated graphics memory manages to be a luxury.

There are absolutely members here that still defend 8GB cards. [...]

It should not exist. Cut it in half and sell it as a $120 5050 instead.

I believe there will be a GB207 chip positioned below this for the RTX 5050. But it retains the same 128-bit memory interface and 8 (potentially 16) GB configuration. This is already the lowest that Nvidia is willing to go, even in SKUs 2 tiers below this - which is what makes it so jarring. A product they are selling at this price really needed to be 16 GB by default, IMHO.
 
8GB would still work if graphical fidelity is not what you're looking for. Just looking at my 3070 Ti in my kid's rig, it still does OK.

I paid 1300 for that GPU during those dark days; that mofo is getting tossed under my coffin before they lower me down.
 
It wasn't even six years ago that the GTX 1660 Ti came out for $280. It was part of a long line of GPUs in the $200-300 range that delivered solid gaming performance for a reasonable price. Granted, the goalposts have moved to 1440p and higher FPS, but now people have to spend twice what they used to for GPUs that aren't even close to offering the rasterization price/performance return the old cards had; meanwhile, Nvidia & AMD are saying "look at the great RT & upscaling you get" that you didn't ask for. I totally get people's frustration, especially with these lower mid-level cards.
I wouldn't say that the goalposts have moved; I would say that a 32" 1440p monitor can be had for less than $200, including tax & shipping.
 
Their settings are based on the 5060 Ti 16GB's own performance; they just picked what the 16GB can run comfortably at over 60 FPS.
So what you're saying is they intentionally picked settings that a 16GB card will run and an 8GB card won't.

Who needs integrity when you earn money from clicks.
 
8GB would still work if graphical fidelity is not what you're looking for. Just looking at my 3070 Ti in my kid's rig, it still does OK.

I paid 1300 for that GPU during those dark days; that mofo is getting tossed under my coffin before they lower me down.
🖖
 
8GB isn't the problem - it's relevant and still sufficient for many gamers, especially those playing lightweight competitive or just less demanding titles, which likely represents a large portion of the gaming community.

The real problem is the wallet pinch: overpriced GPUs and budget-buyer inaccessibility. Basically, gimped crap at premium prices; hardware that, while capable in certain scenarios, doesn't offer the performance or longevity to justify the cost. I get the price-and-performance-mismatch frustration, especially when previous generations offered similar or better value at lower cost, or at the least we had alternatives to fall back on. Today it's a different ball game: manufacturer-induced inflation, hardware gimping, scalper-driven high prices, and on top of that the consumer market isn't getting the love it deserves (supply constraints, data-center/AI-first prioritising).

If Nvidia had launched a standard 60-class GPU with 8GB of VRAM at ~$250 and a 16GB Ti variant at $350, there wouldn't be nearly as much backlash. Completely dismissing 8GB cards doesn't seem realistic; expecting them to be good value at their current price points, however, is like expecting your toaster to run Cyberpunk 2077.
 
As for DOA status (at least among us hobbyists; consumers will eat this shit up as per usual, not really worth getting mad about)
Yeah, well, I'm kind of mad about it, because I know what it will lead to. People think "I just paid a lot of money for a new graphics card", and then, when they realize it can't play the game they want at the settings/res they wanted, they blame the developer. I see it ALL THE TIME on Steam forums. Always the developer's fault, never Nvidia's fault or the consumer's fault when it comes to this. All they see is that they paid a lot of money (even if it's a low-end card with gimped VRAM, it's still expensive), and they think that means they should be able to play the game at ultra/whatever settings.

Okay, maybe not every casual thinks that way, but a lot do. So... that's why I think it does kind of matter, and that's what frustrates me most about it. Especially since I tend to get sucked into conversations with people on 8GB cards trying to get games to work. Sometimes they're angry and just blowing off steam, but some genuinely want help, so I do my best; oft-times they don't like the answers I give. Nobody wants to hear "use more upscaling / reduce quality settings / use dynamic resolution scaling with a low frame cap / don't use framegen / close all other apps (many use VRAM these days - even Steam itself, unless you tell it not to)" as the solutions... And I'm doing this for 30-series cards, 40-series cards, and I know I'll probably be doing it for 50-series cards that cost an arm and a leg too... ugh.

I know I don't have to help and could just cut it off, but you know what it's like when you love a game: you want other people to be able to enjoy it too.
 
Don't understand an 8GB GPU in 2025.
They were relevant in 2014-2019/20, but with data sizes getting larger, 8 GB is no longer enough; 12 GB should be the minimum.
 
I’m sure you’ll be just as vocal when AMD releases an 8GB card, right? Right??

What bothers me personally isn't even the price per se (as things are, prices SHOULD be much lower) - it's that Nvidia is hoarding all the 24 Gbit GDDR7 ICs for their high-margin professional GPUs. With GDDR7, they could perfectly well have released a 12 GB and a 16 GB version of the 5060 Ti, leaving 8 GB for the 5060 and 5050 tiers.

AMD cannot do something similar with the RX 9060 series (likely coming in both 8 and 16 GB flavors) or the 9070 GRE (which will likely be exclusively 16 GB), because they are intentionally using obsolete GDDR6 memory to keep costs down, and there aren't any 24 Gbit GDDR6 ICs on the market.
 
Yet they're still rocking Nvidia, which is using expensive GDDR7...

This is for everyone defending 8 GB cards in 2025. The cheapest model HU (Hardware Unboxed) was able to get cost 400€, and it's not even able to play some games properly at 1080p. [...]
Yeesh, they are scamming the people.
 
This is for everyone defending 8 GB cards in 2025. The cheapest model HU (Hardware Unboxed) was able to get cost 400€, and it's not even able to play some games properly at 1080p. [...]
Look, do NOT expect NVIDIA or AMD (or even Intel) to increase the bus width beyond 128-bit for the xx60-series class cards.

128-bit means four 32-bit memory modules. There are no 3GB memory modules (for the mainstream) yet, so 4x2GB=8GB is the current maximum for this specific bus-width configuration (on one side of the board).

In 2026, at CES, a (refresh) GPU lineup is coming our way; that's when they plan to officially bring 3GB memory modules to the mainstream market.

In other words, the upcoming 5060 SUPER will get 12GB of VRAM on the same 128-bit bus width.

When 4GB memory modules are fully developed, matured, mass-produced and widely available, the 6060-series cards will ship with 16GB of VRAM on 128-bit by default, with no need for a two-sided memory configuration like the original 3090 and other similar cards.
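The capacity arithmetic in that post is easy to check; here's a tiny sketch (assuming one 32-bit GDDR IC per 32 bits of bus width, single-sided):

```python
# Single-sided VRAM = (bus width / 32) ICs x capacity per IC.
# Each GDDR6/GDDR7 IC has a 32-bit interface.

def single_sided_vram(bus_width_bits: int, gb_per_ic: int):
    ics = bus_width_bits // 32
    return ics, ics * gb_per_ic

for gb_per_ic in (2, 3, 4):  # i.e. 16 Gbit, 24 Gbit, 32 Gbit ICs
    ics, total = single_sided_vram(128, gb_per_ic)
    print(f"128-bit bus: {ics} x {gb_per_ic}GB ICs -> {total} GB")
# 2GB ICs -> 8 GB  (today's 5060 Ti 8GB)
# 3GB ICs -> 12 GB (the rumored 5060 SUPER configuration above)
# 4GB ICs -> 16 GB (a future single-sided 16 GB card)
```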
 
AMD cannot do something similar with the RX 9060 series
But they could. No one pushed them into developing a die that can only handle 128 bits' worth of bus; they could've easily gone for 160, 192, or whatever configuration. They've just gotten greedy.
9070 GRE (which will likely be exclusively 16 GB)
Exclusively 12 GB, actually. It's a very sad product for an xx70-tier one, not only because of the VRAM amount but also because it's weak in general.
 
8 GB is no longer enough
That is, of course, an opinion. As has been shown by objective testers other than HUB, it's not one supported by merit. 8GB has been fine for a long time; it doesn't magically become not-fine just because certain people say so, and it certainly doesn't become not-fine because biased sellouts like HUB say so. It also doesn't become not-fine because well-respected reviewers like J2C and GN say so.

People are aggressively pushing for greater VRAM amounts not because 8GB isn't enough to game on, but because it's the popular opinion. They're doing that instead of looking at objective, merit-based testing. It is deliberate misinformation at best and arrogant ignorance at its worst. People desperately need to go back to that thing called "thinking" rather than knee-jerk reacting with their feelings.
 
I'm not sure why so many people are attempting to throw HUB under the bus as if they're doing anything untoward. I'm guessing it's mostly people who didn't watch the video in question and commented on it anyway. Steve said multiple times that this isn't indicative of most games. They also used settings that produced reasonable frame rates on the 16GB card, then dropped them until the 8GB ran well enough. The last three games benchmarked were ones that didn't show any real performance difference.

At the end of the day, the 5060 Ti is a decent GPU that is fast enough to drive resolutions and features that can benefit from more than 8GB of VRAM. If they had allocated 3GB memory ICs for these cards, I'm fairly certain a 12GB 5060 Ti would have been sufficient.

Intel releasing the B580 with 12GB of VRAM at a $250 price point makes any card selling at $300+ with 8GB of VRAM a poor deal. As always, you can get around a VRAM limitation, but at the prices being paid, should you have to?
 