
5060 Ti 8GB DOA

Yes, it did. Pretty much to the same extent as the 5060 Ti 8GB does now. It was hand-waved away with "4GB is cheaper and it's not high-end anyway" :)

Prove it; provide a link. "To the same extent as the 5060 Ti 8GB" is a pretty bold claim given you cannot even use all the card's marquee features without it tripping over itself.

There is some veracity to the pricing claims: the 480 4GB at $270 with inflation is significantly cheaper than the 5060 Ti 8GB MSRP, and especially the actual street price. They are not even in the same tier price-wise, so that should be considered.
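A rough back-of-the-envelope check of that inflation figure (assuming the RX 480 4GB's $199 launch MSRP and roughly 35% cumulative US CPI inflation from mid-2016 to 2025; the exact factor depends on the endpoints you pick):

```python
# Rough inflation check for the RX 480 4GB launch price (assumptions noted inline).
rx480_4gb_msrp_2016 = 199     # USD launch MSRP, June 2016
inflation_factor = 1.35       # assumed ~35% cumulative US CPI, mid-2016 -> 2025
msrp_5060ti_8gb = 379         # USD MSRP cited above

adjusted = rx480_4gb_msrp_2016 * inflation_factor
print(f"RX 480 4GB in 2025 dollars: ~${adjusted:.0f}")                     # ~$269
print(f"Gap to the 5060 Ti 8GB MSRP: ~${msrp_5060ti_8gb - adjusted:.0f}")  # ~$110, more against street price
```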

What? The 5060 Ti models are $379 and $429. Looking at actual prices, 400€ and 450€ seem to be it, which are the EU versions of MSRP. The 16GB is about 50 (£/$/€) more expensive than the 8GB.
Yes, prices are higher than they were years ago. Yes, they have increased more than inflation.

You are thinking of MSRP, which again I cited as the start of the provided pricing range. Street pricing is as I stated at the top of the range:

[screenshot: current US street pricing for the 5060 Ti]


I don't understand why you are acting surprised in regard to my pricing; it's based on the MSRP and the actual price in the US market. I also don't understand the insistence on pushing MSRP pricing when we haven't seen MSRP pricing on 5000 series GPUs.

My point was that the price difference between the 5060 Ti 8GB and 16GB isn't that much larger than it was between the 480 4GB and 8GB, and it isn't.
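For what it's worth, the memory-tier premium can be compared in both absolute and relative terms (assuming launch MSRPs of $199/$239 for the RX 480 and the $379/$429 figures cited above):

```python
# Price gap between the lower- and higher-VRAM variants, absolute and as a share of the cheaper card.
cards = {
    "RX 480 4GB -> 8GB (2016)":   (199, 239),
    "5060 Ti 8GB -> 16GB (2025)": (379, 429),
}
for name, (low, high) in cards.items():
    print(f"{name}: +${high - low} (+{100 * (high - low) / low:.0f}%)")
# RX 480:   +$40 (+20%)
# 5060 Ti:  +$50 (+13%)
```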

So you are telling me the quibble you had was over the $10 difference between our two figures, which doesn't change your conclusion here anyway. $30 or $40 price difference, the pricing was still similarly structured.

That's such a pointless thing to argue, and it doesn't really explain what makes my prior comment "weird," as you called it.

The whole issue of basing comparisons on ultra-high-end is something I think is utterly unproductive but this is a different discussion/argument :p

This isn't ultra-high end. It's not even midrange.
 
You brought up the RX 480 8GB as an example of something positive - a card that, back then, had a good amount of VRAM. When it was pointed out that there was also an RX 480 4GB for ~$40 less, you seem to think that is also a positive, or at least a neutral, thing. Why doesn't the same apply to the RTX 5060 Ti? There is an RTX 5060 Ti 16GB and there is an RTX 5060 Ti 8GB for $50 less. What makes this situation dissimilar?
 
I don't understand why you are acting surprised in regard to my pricing; it's based on the MSRP and the actual price in the US market. I also don't understand the insistence on pushing MSRP pricing when we haven't seen MSRP pricing on 5000 series GPUs.

Didn't they already say the price in euros? In the EU and UK the 5060 Ti and the 5070 are available for MSRP. So yes, we have seen it.
 
You brought up the RX 480 8GB as an example of something positive - a card that, back then, had a good amount of VRAM. When it was pointed out that there was also an RX 480 4GB for ~$40 less, you seem to think that is also a positive, or at least a neutral, thing. Why doesn't the same apply to the RTX 5060 Ti? There is an RTX 5060 Ti 16GB and there is an RTX 5060 Ti 8GB for $50 less. What makes this situation dissimilar?

No, I brought the RX 480 8GB in as an example of an 8GB card from 2016. Again, as I've already stated, the point was to illustrate the regression in value.

You were the one who came in and brought up the 4GB model and questioned why I didn't use the 4GB card. If the light I put the 4GB RX 480 in seemed positive, that's because it's relative to the 5060 Ti and it's a comparison YOU wanted. Not the one I originally made.

So I'm not exactly sure what you want here: you complain that I'm "weird" for comparing the 8GB 480 to the 8GB 5060 Ti, and then complain that I'm showing the 4GB model in too positive a light, ignoring that we are comparing it specifically to the 5060 Ti 8GB and that you are the one who compelled that comparison in the first place.

Also, still waiting on the evidence that backs up your claim that the 4GB RX 480 was as bad at launch as the 5060 Ti 8GB.

Didn't they already say the price in euros? In the EU and UK the 5060 Ti and the 5070 are available for MSRP. So yes, we have seen it.

This sub-discussion started with my comment, which was priced in USD. You are free to state your own regional pricing, but if you are going to contest an argument on the basis of pricing, you need to reply in the currency used in the original argument. This should go without saying: it's not a counter-argument if both statements can be mutually true or aren't using the same values. In the instance of differing regions/currencies, a simple "That's unfortunate for you guys. Luckily for us in the EU, pricing here is able to hit MSRP" makes sense where you'd like to note that your regional pricing does hit MSRP, while recognizing that EUR and US pricing isn't interchangeable and that it's not a counter-argument to a different market's pricing.

You seem to be under the impression that this argument was in EUR originally, but it was in USD. I don't mind people stating their own currencies, but replying to a comment saying they are wrong on the basis of pricing and then using a different currency is not a counter-argument. It's just rude and ignorant; they are entirely different values. It's on the level of the stupid Americans pretending their country is the only one that exists. If your goal was to bring your arguments closer to theirs, congrats, mission accomplished.
 
This sub-discussion started with my comment, which was priced in USD. You are free to state your own regional pricing, but if you are going to contest an argument on the basis of pricing, you need to reply in the currency used in the original argument. This should go without saying: it's not a counter-argument if both statements can be mutually true or aren't using the same values. In the instance of differing regions/currencies, a simple "That's unfortunate for you guys. Luckily for us in the EU, pricing here is able to hit MSRP" makes sense where you'd like to note that your regional pricing does hit MSRP, while recognizing that EUR and US pricing isn't interchangeable and that it's not a counter-argument to a different market's pricing.

You seem to be under the impression that this argument was in EUR originally, but it was in USD. I don't mind people stating their own currencies, but replying to a comment saying they are wrong on the basis of pricing and then using a different currency is not a counter-argument. It's just rude and ignorant; they are entirely different values. It's on the level of the stupid Americans pretending their country is the only one that exists. If your goal was to bring your arguments closer to theirs, congrats, mission accomplished.

It's an international forum, and if someone is going to say the 5000 series isn't selling for MSRP, they ought to qualify it with "in the US" if that's what they mean. Otherwise the statement isn't true; most of the world is not the US.
 
If the 50 series ran at 0.7 V, it would not burn and it would be efficient: 2 GHz like the 30 series but at a third of the wattage, so you could run a 5090 at 200 W using an 8-pin plus slot power.

A 5090 at 200 W would beat a 3090 at 200 W by a good margin.
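As a sanity check on the "one third of the wattage" figure, here's the usual dynamic-power approximation P ∝ f·V², with assumed stock numbers of roughly 2.9 GHz at 1.05 V and 575 W for the 5090 (actual stock voltage and clocks vary by card and load, and static/memory power is ignored, so treat this as a rough sketch, not a measurement):

```python
# Dynamic power scales roughly with frequency * voltage^2 (switching capacitance assumed constant).
stock_v, stock_ghz, stock_w = 1.05, 2.9, 575   # assumed 5090-ish stock operating point
uv_v, uv_ghz = 0.70, 2.0                       # the hypothetical 0.7 V / 2 GHz point

scale = (uv_ghz / stock_ghz) * (uv_v / stock_v) ** 2
print(f"Scaling factor: {scale:.2f}")                      # ~0.31, i.e. roughly a third
print(f"Estimated board power: ~{stock_w * scale:.0f} W")  # ~176 W, in the ballpark of the 200 W claim
```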

This is the source of the higher VSync power use on the 5070-5090 this generation. All anyone needs to do to identify this is look at the GPU's voltage at VSync in the Efficiency & Clock Speed section; here it is for the 5070:

[screenshot: RTX 5070 review, Efficiency & Clock Speed voltage chart]


0.998 V average (with a 0.990 V minimum) is quite a high minimum voltage and simply bad (and maybe lazy) efficiency. For the past few gens before the 4xxx series, you could scale down to the 0.7xx V range to save more power, but in the 4xxx series that minimum ended up being in the ~0.86 V range. However, if you look at power usage that gen, it still scaled overall power usage down pretty well, with the 4070 and even the 4080 Super using pretty minimal power, so 0.86 V seemed to provide enough efficiency, even if it wasn't reaching as low a voltage as previous gens.

The efficiency has been different on the AMD side, with the 7xxx series scaling voltage down nicely but power usage not falling as well as in the previous 6xxx and 5xxx gens. So, higher power usage at low demand last gen, which was a bit disappointing (I have a 7700 XT). This might be attributed to the extra power needed to feed the chiplet design, not sure. But this gen AMD is back to monolithic, and I see the 9070/XT VSync in the 0.7xx V range and, what do you know, they have better low-end efficiency than Nvidia this gen.

Low voltage scaling is a good thing and I hope to see it come back to Nvidia's higher-end GPUs. The new 5060 Ti has a lower voltage mode and it shows good low-power behavior like the 4060 Ti I have, so I hope the 5070-5090 higher voltage minimums are a temporary problem.
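To put those voltage floors in perspective, here's a crude P ∝ V² comparison at a fixed light-load clock (it ignores static power and clock differences, and the ~0.75 V figure is just an assumed midpoint of the "0.7xx V" range, so it's only directional):

```python
# Relative core power at a light, VSync-like load if voltage were the only variable (P ~ V^2).
floors = {
    "RTX 5070 floor (0.998 V)":        0.998,
    "40-series floor (~0.86 V)":       0.86,
    "older gens / RX 9070 (~0.75 V)":  0.75,
}
base = floors["RTX 5070 floor (0.998 V)"]
for name, volts in floors.items():
    print(f"{name}: {100 * (volts / base) ** 2:.0f}% of the 5070's light-load core power")
# ~100%, ~74%, ~56%
```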

If you haven't guessed by now, I find @W1zzard 's Voltage measurements an essential part of any review and he's the only reviewer I see doing these. Major kudos for that!
 
It's an international forum, and if someone is going to say the 5000 series isn't selling for MSRP, they ought to qualify it with "in the US" if that's what they mean. Otherwise the statement isn't true; most of the world is not the US.

Which is what the $ symbol implies. Do you think it meant marshmallows?

Most online reviews, including TPU's, GN's, HWUB's, etc., use the $ (USD) symbol. Do they have to state which specific market it applies to? No, that's stupid. It's already obviously denoted.

Even KitGuru uses both pounds and USD, but even if they just used pounds, it's not like I'd go and shit on them because they failed to disclose that pounds = UK. That's bloody fricking obvious. Almost every review outlet and their reviews run afoul of your inane stipulation.

It's pretty hypocritical as well: you used euros, but which market does that apply to? There are 27 member states of the EU; it could apply to any one of them. You don't even follow your own rules. In reality, it doesn't matter, because most people have a brain and know how to check local pricing.

I'd also like to point out that if there was a question of which market certain pricing applies to, the appropriate thing to do would be to ask, not call them wrong and use a different currency altogether. The argument you have formed is not one of any legitimate concern, but one made in bad faith with the bar constantly moving.
 
What do you think PS6 will change?
Paradigm.


First and foremost, game developers develop their games for consoles first, then they port them to PC.

The majority of console owners own a PS5, i.e. roughly 6700 XT / 2080 Ti-class performance (in raster), and the devs' target FPS (for AAA story-driven titles) is, you guessed it, the glorious 30 FPS.
Gotta be kiddin' me, SMH

It WILL cost them time and money to "optimize" for literally anything else, unless sponsored by the RGB, OFC.
Sponsored by the RGB usually means xx80-class cards for the far-right-of-the-slider graphical options AAA gameplay experience, aka not affordable.

The PS6 will be based on the RDNA5+ (UDNA) architecture, aka RT performance parity with the green team (if RDNA4 is taken as an indication). It will have at least 24GB of VRAM (most likely 32GB). The PS6 will provide 5090+ performance at a fourth of that card's MSRP.
The RDNA4-based RX 9070 XT is officially faster than the 3090 Ti in raw RT performance.

When the PS6 launches in 2028 or so, the paradigm won't shift right away; it takes around two years for people to switch to the new console and for the PS6 to become the new majority holder. THAT is when game devs will finally begin to truly develop (uncompromisingly) for the PS6 exclusively, and THAT is when RT will finally become mainstream (in its true meaning).
Even w/o upscaling, you'll be able to enjoy maxed-out RT 4K gameplay at 100% native render resolution to your heart's content.


So again,
Things won't improve much, until PS6 comes out.

Much love:love:
 
This is the source of the higher VSync power use on the 5070-5090 this generation. All anyone needs to do to identify this is look at the GPU's voltage at VSync in the Efficiency & Clock Speed section; here it is for the 5070:

[screenshot: RTX 5070 review, Efficiency & Clock Speed voltage chart]

0.998 V average (with a 0.990 V minimum) is quite a high minimum voltage and simply bad (and maybe lazy) efficiency. For the past few gens before the 4xxx series, you could scale down to the 0.7xx V range to save more power, but in the 4xxx series that minimum ended up being in the ~0.86 V range. However, if you look at power usage that gen, it still scaled overall power usage down pretty well, with the 4070 and even the 4080 Super using pretty minimal power, so 0.86 V seemed to provide enough efficiency, even if it wasn't reaching as low a voltage as previous gens.

The efficiency has been different on the AMD side, with the 7xxx series scaling voltage down nicely but power usage not falling as well as in the previous 6xxx and 5xxx gens. So, higher power usage at low demand last gen, which was a bit disappointing (I have a 7700 XT). This might be attributed to the extra power needed to feed the chiplet design, not sure. But this gen AMD is back to monolithic, and I see the 9070/XT VSync in the 0.7xx V range and, what do you know, they have better low-end efficiency than Nvidia this gen.

Low voltage scaling is a good thing and I hope to see it come back to Nvidia's higher-end GPUs. The new 5060 Ti has a lower voltage mode and it shows good low-power behavior like the 4060 Ti I have, so I hope the 5070-5090 higher voltage minimums are a temporary problem.

If you haven't guessed by now, I find @W1zzard 's Voltage measurements an essential part of any review and he's the only reviewer I see doing these. Major kudos for that!
I use 2825 MHz at 0.9 V. But before the latest driver, it ran at a higher frequency. That's 165 W in PlanetSide 2. Later I will test 0.8 V, because even at 0.85 V it only draws 90-100 W.

The latest drivers have a hard-coded 2500 MHz limit at 0.82 V in Afterburner; when I set it higher, it snaps back to 2500 MHz. Then it averages the frequencies around 0.82 V, which leads to ~2400 MHz due to the same driver behavior (which is different than before).

So, with the latest driver's ~10% gimped state, these are the results:

Furmark


Normal benchmark of Kombustor rendering


25 minutes of Aliens Fire Team Elite at maxed settings (57-62 FPS)

A 4070 at default gets 40-45 FPS in the same scene. So even at 0.82 V, the 5070 is at least 30% faster than the 4070 in this game, and about 40% faster in Furmark.


In PlanetSide 2 with 200 players in battle, the GPU is at 49 °C and 88-92 W. But in an empty desert, it's 130 W.
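A quick check of those relative-performance numbers using the FPS ranges quoted above (this assumes the 4070's 40-45 FPS was measured in the same scene and settings):

```python
# Relative performance from the quoted FPS ranges (same scene and settings assumed).
fps_5070_uv = (57, 62)   # 5070 at 0.82 V, Aliens: Fireteam Elite, maxed settings
fps_4070    = (40, 45)   # 4070 at default, same scene (as stated above)

worst = fps_5070_uv[0] / fps_4070[1] - 1       # ~+27%
mid   = sum(fps_5070_uv) / sum(fps_4070) - 1   # ~+40%
best  = fps_5070_uv[1] / fps_4070[0] - 1       # ~+55%
print(f"5070 @ 0.82 V vs 4070: +{worst:.0%} to +{best:.0%}, ~+{mid:.0%} at the midpoints")
```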
 
Let's see how DOA the 5060 Ti 8GB will be when it's the only GPU that sells at MSRP and everything else is overpriced :roll: .

HUB bashing the 4060 and 4060 Ti 8GB didn't exactly stop them from being popular.

It was more than HUB; Gamers Nexus and others also said not to buy the RTX 4060 at all.
Nobody was "influenced" at all, I guess.
 
It was more than HUB; Gamers Nexus and others also said not to buy the RTX 4060 at all.
Nobody was "influenced" at all, I guess.

Put simply: I read and watched every review of the 4060/Ti 8GB and understand the criticisms; IMO they're valid, if exaggerated by some.

Yet I bought an 8GB 4060 Ti anyway because it fits what I want to do with it. In the long run I may regret not spending $450 for 16GB instead of $380, but my focus has always been efficiency, especially in lower-power situations, because almost all of my regularly played games are not OTT demanding and I use optimized settings. Plus it fit perfectly in my kid's PC (after the old 1060 died), and he plays similarly mid-demanding games.

If I was playing mostly 2023+ games then I wouldn't consider the 8GB xx60 Ti model. Really nobody should, save a little more money and buy the 16GB or the xx70. But the 8GB xx60 Ti has a place and does a good job in it. I just wish it was a little cheaper but there are few things not included in that basket.

I use 2825 MHz at 0.9 V. But before the latest driver, it ran at a higher frequency. That's 165 W in PlanetSide 2. Later I will test 0.8 V, because even at 0.85 V it only draws 90-100 W.

The latest drivers have a hard-coded 2500 MHz limit at 0.82 V in Afterburner; when I set it higher, it snaps back to 2500 MHz. Then it averages the frequencies around 0.82 V, which leads to ~2400 MHz due to the same driver behavior (which is different than before).

So, with the latest driver's ~10% gimped state, these are the results:

Furmark

Normal benchmark of Kombustor rendering

25 minutes of Aliens Fire Team Elite at maxed settings (57-62 FPS)
A 4070 at default gets 40-45 FPS in the same scene. So even at 0.82 V, the 5070 is at least 30% faster than the 4070 in this game, and about 40% faster in Furmark.

In PlanetSide 2 with 200 players in battle, the GPU is at 49 °C and 88-92 W. But in an empty desert, it's 130 W.

Dude, you're getting 0.82 V under lowered demand with the 5070, that's great! Based on W1zz's tests, it seemed that wasn't happening, at least with the release drivers. I think I understand what I'm seeing here, but I don't like to use Furmark or Kombustor; I prefer Steel Nomad or Time Spy Gfx test 2 as they feel more like a regular game GPU load. But those Aliens Fire Team Elite numbers look awesome to me.

It's great to know that lower voltages are available on the 5070, and likely the 5070 Ti/5080 as well.
 
Put simply: I read and watched every review of the 4060/Ti 8GB and understand the criticisms; IMO they're valid, if exaggerated by some.

Yet I bought an 8GB 4060 Ti anyway because it fits what I want to do with it. In the long run I may regret not spending $450 for 16GB instead of $380, but my focus has always been efficiency, especially in lower-power situations, because almost all of my regularly played games are not OTT demanding and I use optimized settings. Plus it fit perfectly in my kid's PC (after the old 1060 died), and he plays similarly mid-demanding games.

If I was playing mostly 2023+ games then I wouldn't consider the 8GB xx60 Ti model. Really nobody should, save a little more money and buy the 16GB or the xx70. But the 8GB xx60 Ti has a place and does a good job in it. I just wish it was a little cheaper but there are few things not included in that basket.



Dude, you're getting 0.82 V under lowered demand with the 5070, that's great! Based on W1zz's tests, it seemed that wasn't happening, at least with the release drivers. I think I understand what I'm seeing here, but I don't like to use Furmark or Kombustor; I prefer Steel Nomad or Time Spy Gfx test 2 as they feel more like a regular game GPU load. But those Aliens Fire Team Elite numbers look awesome to me.

It's great to know that lower voltages are available on the 5070, and likely the 5070 Ti/5080 as well.
But I didn't test video encoding/decoding. Anyway, I don't care about video decoding/encoding; I have another GPU in the system to offload anything "power saving mode" in Windows 11, assuming Windows 11 interprets YouTube as power-saving-mode type work and offloads it to the 4070.

I used the curve. If you mean default settings, it boosts depending on load, lower at low load. But the old drivers were maxing the boost.
 
Not necessarily.

They simply didn't have the means to go upwards of xx60 class cards.
Yup, technically it's still the cheapest new generation card and so will naturally be the bulk of sales.

NVIDIA told HUB the reason they're doing 8GB again is because the 4060 sold very well and is popular on hardware surveys so obviously 8GB isn't an issue for consumers. :roll:

Yeah, just ignore that it's the cheapest card in the line-up and that most buyers probably don't know the implications of VRAM. Under those conditions it's supposed to sell well; the issue is the obvious planned obsolescence on a modern "RTX" card.

Before this it was the 3070 8GB, how poorly that's aged.
 
why are game devs so shit at optimization now?
Here are some ideas:

1) Because they have to take 8GB cards into consideration, when often the code they're dealing with was designed with ~10-13GB in mind for visuals (from a total of 16GB of GDDR... and 512MB of DRAM / 2GB for the PRO). It takes time to take the ~12GB of VRAM assets and figure out what would work fine in DRAM (see the toy sketch after this list), and if you are under a deadline, you might be putting things in VRAM that would work just fine in DRAM. And that would probably be due to something like... #2 (*1)

2) Games are becoming very expensive and labour-intensive to make, fiscal quarters are short, and since you can always patch a game afterwards, upper management often likes to push them out the door prematurely to please shareholders (tbf it is important to please shareholders if you are a public company... but you also have to please customers... it's a balancing act). Anyway, it's not always the publisher / higher-level developer positions making these decisions... but usually it is. I mean, you obviously need some oversight to keep scope from creeping out of hand. But... yeah... still... the number of games released unpolished because of deadlines is getting very high because of the patch-later mentality.

3) Development teams being put on projects their skillsets are not most optimal for, for example putting a team used to making single player narrative games onto a live service or vice versa.

4) Teams becoming too big. Yes, adding heads does increase productivity.... to a point. But eventually it can start to drag things down if it gets out of hand. Miscommunications and misunderstandings are bound to arise and cause problems. You're also gonna have to deal with disagreements on vision and what-have-you, and other wastes of time that could be better handled by a small, united, and skilled team.... which brings me to..... #5

(warning, the following is totally just speculation)

5) I wonder how many skilled developers have either retired or left gaming to work somewhere else. Skilled programmers are needed all over the place, and you may be tempted by a faster-growing sector if you feel you aren't being paid enough or are treated poorly (a common story in game development environments).

But perhaps with some new developers, all they have ever known is higher-level languages, which means they may not grasp how their higher-level commands are executed by the CPU and GPU as well as somebody who has been doing this for a long time and, say, used to code in assembly; I feel like those devs would have a better understanding of how to get the most out of the hardware. Again, this is just a guess, and no shade to developers; all I've ever known is higher-level languages too, and when I used to make games as a hobby, I'm sure in 1s and 0s they were as unoptimized as they come (but it didn't matter since they were just small 2D games, and not made for money either).

*1) Perhaps the Switch 2 coming out with only 12GB of shared memory (vs 16), and only DRAM at that, may very well help lower the target and force tighter programming from the get-go for multi-platform games. Though this depends on the Switch 2 becoming a hit, and on publishers deeming it capable of running newer AAA games. Then again, abusing DLSS could very well be the answer to this, along with other "easy" solutions like cutting the framerate. So maybe it's wishful thinking. I still wouldn't blame the developers themselves though; they are only cogs in a machine, most of the time, doing the best they can with the time and resources they are given.
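As a toy illustration of the budgeting problem in #1, here's a greedy pass that decides which assets stay resident in VRAM and which spill to system RAM on an 8GB card (all asset names, sizes, and "hotness" scores below are made up purely for illustration):

```python
# Toy VRAM-vs-system-RAM budgeting pass (all names and numbers hypothetical).
assets = [  # (name, size in GB, per-frame "hotness": higher = should stay in VRAM)
    ("hero_character_textures", 1.5, 0.95),
    ("terrain_megatexture",     3.0, 0.90),
    ("distant_lod_meshes",      1.5, 0.30),
    ("cinematic_only_assets",   2.5, 0.05),
    ("audio_banks",             1.0, 0.00),   # perfectly fine in system RAM
]
VRAM_BUDGET_GB = 8.0 * 0.8   # leave ~20% headroom for framebuffers, driver, etc.

resident, spilled, used = [], [], 0.0
for name, size_gb, hotness in sorted(assets, key=lambda a: a[2], reverse=True):
    if used + size_gb <= VRAM_BUDGET_GB:
        resident.append(name)
        used += size_gb
    else:
        spilled.append(name)   # streamed from system RAM / disk instead

print(f"Resident in VRAM ({used:.1f} GB): {resident}")
print(f"Spilled to system RAM: {spilled}")
```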

Because 8GB was midrange a decade ago and 1440p displays are the new standard.
Yeah, you know, I've been frustrated for a long time that it's nearly impossible to find cheap 22-24" gaming monitors that are 1440p (usually they are 27" or bigger, which means lower pixel density, which feels like a downgrade), and that's part of the reason why I'm still rocking a 10-year-old monitor. But the other day I went on Amazon and, all of a sudden, they were there!!! Not one, but many. Like, I swear I just checked a few months ago (okay, maybe it was 6 months... or longer), but anyway, the point is, in my region at least, they went from non-existent to pervasive very fast.
 
Because 8GB was midrange a decade ago and 1440p displays are the new standard.


Wrong and wrong.

It's the economy, stupid.

Lower time to market = less dev cost (time x # of devs = $$$); getting the product to market sooner means less time between investment and return on investment, and therefore a reduction of risk, too.

This is also why games come out as early access: it funds development. Optimization is always, only, ever done to make games run on targeted hardware for the specific target market for the product in question, or to expand the range of target devices you want to sell stuff on. It is specifically NOT ever done so you can run the game at 90 FPS instead of 60. Some games don't even bother with that demand to begin with and just lock the whole affair at 60.

It's a simple matter of cost/benefit and moving the maximum palatable amount of cost towards the end-user while not making bad press. Sometimes it goes wrong, but far more often we sugar-coat the bugfest with "but game is so gud" or just simple fan-camp behaviour. We are that easy to divide and conquer, even in the face of a poorly performing product ;) The ENTIRE SELLING POINT of the Unreal Engine right now is exactly this approach to workflows: they move the expense of optimization towards a higher hardware requirement. Look at the Oblivion Remastered release. Just running the engine on top of an old, shaky framework. It works. It's super easy money.

Really, the above is the bottom line for all the things in life you don't understand or that don't make sense when it comes to any product or service: it's about money. If it looks or feels weird, you know why: someone's making money off it, and it ain't you.
 
Yup, technically it's still the cheapest new generation card and so will naturally be the bulk of sales.
+

NVIDIA told HUB the reason they're doing 8GB again is because the 4060 sold very well and is popular on hardware surveys so obviously 8GB isn't an issue for consumers. :roll:
Yeah, right ;)

Yeah, just ignore that it's the cheapest card in the line-up and that most buyers probably don't know the implications of VRAM. Under those conditions it's supposed to sell well; the issue is the obvious planned obsolescence on a modern "RTX" card.
12GB is the bare minimum, period.
I do mod my AAA RPG titles; a PROPER modding experience (nice-looking NPCs, detailed textures, plus distant LODs) needs at least 12GB of VRAM.

Unless NVIDIA gives the green light to a 12GB 5060 SUPER (128-bit, with 3GB memory modules) at CES 2026, I simply can't recommend/suggest the xx60 series with 8GB of VRAM to anyone.

Before this it was the 3070 8GB, how poorly that's aged.
Tell me about it.
 
I don't recall there 1) being a game, and 2) if there was, it was certainly not "let me ask a question no one asked in the previous conversation and then answer it" with a wall of text irrelevant to the previous question.
The question was "name one Nvidia series where the x060 Ti was the last card in the series."
You answered "When was the last series where the xx60 series had a Ti release before the base model released?"
"Before the base model released" was never asked by me, but by all means, you have my permission to continue posting in your "game" by yourself, asking questions nobody asked and then answering them.



yeah :rolleyes: just above you

Your response was to someone claiming that the xx60 ti offerings this generation sucked, and were basically garbage.

The retort back was that you asked when the xx60ti version was the last card in a generation...well after the xx60 non-ti was confirmed by Nvidia. You then died on the hill of "well, I said ti version so because there's a non-ti version you're wrong."

The retort to your retort was that Nvidia has constantly been shifting their product line up. They are artificially inflating their offerings by offering a Ti option day one... whereas they used to offer a full stack in the past and then offer a Ti/Super as an optional mid-cycle refresh. I.e., the xx60 and xx70 launch, with either an xx60 Ti or xx60 Super refresh mid-cycle at about the pricing of the non-Ti/Super. It bumped up performance a little bit, cost about the same, and filled the stack fully between the existing rungs and the new half rungs. Of course, today they release day one with a 5070 and 5070 Ti, killed the 5030 and 5050, and now those more affordable entry-level cards... do not exist. All of this is in service of charging $3000 for the highest-end cards... which I wish was a joke but is where we are. That shift upwards, bastardizing the names Ti and Super to artificially segment numbers higher without admitting to simply not offering something reasonable (again, inflating price without offering the inflated performance that would have matched in prior generations), is Nvidia being greedy, and it's why the 5060 and 5060 Ti are both damp squibs of disappointing "this would have been a 1040 and maybe a 1050 level offering 4 generations ago but is now being sold as the 1060 and 1060 Ti to artificially inflate numbers and perceptions of value relative to nothing... because we can" from Nvidia.


Of course, you're welcome to pretend that their 5060 is something that will somehow prop up their 5060 Tis (yes, two of them, differentiated by VRAM amount, which is their not-so-subtle way of saying "this is the card we think people will pay a premium on for the VRAM to keep working in the future, as our entry-level option"). Almost like their 1060 3GB and 6GB. Or their 3080 10GB and 12GB, and the Ti. Or the 8GB and 16GB offerings on the 5060 Ti. Almost like Nvidia knows they are choking their product lines by offering finite VRAM... and after creating that problem, they'll sell you the solution. Translated into normal human speech: the double middle finger of sales that only those confident that you have limited other options can offer. Which is accurate today, but it will breed contempt, such that when the AI boom crashes they will have to explain to customers why it was acceptable that they were jack ***** and why we should still buy from them... when they thought it was fine to give any potential for affordable gaming a pass, because why not cut out everyone who isn't willing to be raked over the coals on pricing?
 
12GB is the bare minimum, period.
I do mod my AAA RPG titles; a PROPER modding experience (nice-looking NPCs, detailed textures, plus distant LODs) needs at least 12GB of VRAM.
Oh yeah, this conversation is often so centred around textures that I forget all of the other things VRAM is good for. Intel's B580 should have been the start of a new standard.
Tell me about it.
What's more, the 3070 and the 6800 are both about as fast as the 5060 is now. However, the 6800 will still survive the next generation thanks to its 16GB headroom, whilst 3070 users are probably looking to upgrade.
 
Wrong and wrong.

It's the economy, stupid.

Lower time to market = less dev cost (time x # of devs = $$$); getting the product to market sooner means less time between investment and return on investment, and therefore a reduction of risk, too.

This is also why games come out as early access: it funds development. Optimization is always, only, ever done to make games run on targeted hardware for the specific target market for the product in question, or to expand the range of target devices you want to sell stuff on. It is specifically NOT ever done so you can run the game at 90 FPS instead of 60. Some games don't even bother with that demand to begin with and just lock the whole affair at 60.

It's a simple matter of cost/benefit and moving the maximum palatable amount of cost towards the end-user while not making bad press. Sometimes it goes wrong, but far more often we sugar-coat the bugfest with "but game is so gud" or just simple fan-camp behaviour. We are that easy to divide and conquer, even in the face of a poorly performing product ;) The ENTIRE SELLING POINT of the Unreal Engine right now is exactly this approach to workflows: they move the expense of optimization towards a higher hardware requirement. Look at the Oblivion Remastered release. Just running the engine on top of an old, shaky framework. It works. It's super easy money.

Really, the above is the bottom line for all the things in life you don't understand or that don't make sense when it comes to any product or service: it's about money. If it looks or feels weird, you know why: someone's making money off it, and it ain't you.
Yes, yes, we get it: devs push for an FPS target as quickly as possible using every tool at their disposal and don't optimize further, because publishers then push it out the door before it's finished. We've heard it all before. That doesn't excuse the lack of culling and LODs in games like the SH2 remake. It can be a lot of things.
 
Yes, yes, we get it: devs push for an FPS target as quickly as possible using every tool at their disposal and don't optimize further, because publishers then push it out the door before it's finished. We've heard it all before. That doesn't excuse the lack of culling and LODs in games like the SH2 remake. It can be a lot of things.
It's no excuse at all. But we do treat it as one when we buy the games day one. Gamers have a strong tendency to spend first and ask questions later... because deep down they just wanna play. Devs know this.
 
12GB is the bare minimum, period.
Just a clarifying question: do you think no new 8GB GPU should be produced now, no matter the price and relative GPU performance?
 
Just a clarifying question: do you think no new 8GB GPU should be produced now, no matter the price and relative GPU performance?
I think 32GB is the bare minimum. Modded Skyrim at 8K uses more than 24.

You know, people trying to force their personal needs over everyone else's is a sight to behold...
 