
NVIDIA GeForce RTX 5060 Ti 16 GB SKU Likely Launching at $499, According to Supply Chain Leak

When your customers are stupid and willing to pay 50-100% over a fake MSRP, the most stupid thing you can do is be honest and not take advantage of that stupidity to turn it into higher profit margins.


This is what happens when YouTubers side with the monopoly. Gamers Nexus was (in my opinion) the channel Nvidia chose, thanks to its credibility, to silence everyone saying that the 12-pin connector is a fire hazard. Steve kept defending the "user error" narrative for months after that video.

How does paying $499 for a GPU make people stupid?
I'm sure 90% of gamers are poor people; why else is there so much whining about prices?
 
How does paying $499 for a GPU make people stupid?
I'm sure 90% of gamers are poor people; why else is there so much whining about prices?
Maybe I am wrong; maybe paying $400-$500 for something that was $150-$250 8-10 years ago is in fact normal. Because of inflation, for example. I am pretty sure 2-3% inflation for 8-10 years amounts to a 100% price increase.
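For reference, compound inflation is just (1 + r)^n. A quick sketch using the 2-3% and 8-10 year figures quoted above:

```python
# Compound inflation: total fractional increase after n years at annual
# rate r is (1 + r) ** n - 1. The rates/years are the figures quoted above.
def cumulative_increase(rate: float, years: int) -> float:
    return (1 + rate) ** years - 1

print(f"2% over 8 years:  {cumulative_increase(0.02, 8):.0%}")   # ~17%
print(f"3% over 10 years: {cumulative_increase(0.03, 10):.0%}")  # ~34%
```

So even the high end of that range works out to roughly a third, nowhere near a 100% price increase.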
 
They moved the product stack but claimed software made it so much better
To be fair though the software does make it much better, so much so that I'm willing to pay that premium. RDNA 4 STILL doesn't have ROCm support and it's been over a month for example. Problem with NVIDIA is their VRAM and pricing.
When your customers are stupid and willing to pay 50-100% over a fake MSRP, the most stupid thing you can do is be honest and not take advantage of that stupidity to turn it into higher profit margins.
Weren't you one of the guys saying the 9070 XT should have an MSRP of $700 and it was a good price? Yeah, they're DEFINITELY not listening to your "advice":roll:
I am pretty sure 2-3% inflation for 8-10 years amounts to a 100% price increase.
LMAO
 
Maybe I am wrong; maybe paying $400-$500 for something that was $150-$250 8-10 years ago is in fact normal. Because of inflation, for example. I am pretty sure 2-3% inflation for 8-10 years amounts to a 100% price increase.

The prices are what they are, people aren't stupid for paying what they have to pay. Are people stupid for paying 2-3x as much for a used car nowadays as they did five years ago? The world changed since 2020 and hasn't changed back yet. In fact with the tariffs at least in the US the current prices could soon be a pleasant memory.
 
Some people are just living in the past.. completely out of touch. Go buy an 8-10 year old GPU and let me know what you think.

I got my 4070 Ti for $640 during an Amazon sale when they came out, so to me this seems a bit expensive for what you're getting.
 
Are you saying gaming performance is never better on the 16GB card vs the 8GB? Because there are multiple cases where there is a big difference in minimums and/or averages. Not everyone who bought the 16GB is just stupid, some decided it was worth paying a bit extra to avoid the cases where an 8GB card came up short (yes even at 1080p). It costs a bit more but you can sell or trade it in for more when you come to upgrade. It just seems to have become an accepted truth that the 16GB card is just a con, which isn't remotely true.

Hopefully we'll see more games used in GPU reviews on this site which use over 8GB, and show the minimums too, so we can get the full picture. TLOU1 was one such game where the 16GB card was notably quicker than the 8GB even at 1080p, I expect TLOU2 may be the same.
No, I'm saying that people who are informed and know the value of the cards are well aware that the 16GB 4060 Ti is just a poor buy.
The 4070 and 4070 Super offered a serious performance upgrade with "enough" VRAM.
The 4060 had a use case as a low-powered card where 8GB was generally enough for the target audience.
The 4060 Ti 8GB was DOA from day one given the card's concerning expected lifespan.
The 4060 Ti 16GB was DOA from day one for its very poor value and the fact that it upsells you to the 4070.

In general I think less of people that have a 4060 Ti, unless it's the 16GB card and they bought it because they truly needed the 16GB.
Futureproofing I will not accept, considering the performance tier of the card.
 
Nah.. I have a 3070 Ti running @ stock in my other system.. it's fine. Of course you can do better.. but for what it is it was/is pretty decent.
 
Yes, and the 5060 should replace the venerable 1030 as a 5030 and should be sold for sub-$100, yes. See, I too can make ridiculous statements that have no basis in reality.
Keeping it real though - sure, what NV is doing IS questionable at best, but anyone who did NOT expect that with further node shrinks there will be a rearranging of the stack for all GPU makers hasn’t been paying attention. Hell, back in the day of, say, Fermi the x70 card was based on the cut-down version of the top chip. This obviously isn’t tenable anymore. Not for the (theoretical) price of 550 bucks.
Go watch the Gamers Nexus video and look at the spec sheets. The 5070 spec sheet is more in line with what would have been a 50-class card even two or three gens ago. This isn't hyperbole; it's based on actual facts and specs.

Think the video is called something along the lines of "NVIDIA Shrinkflation."
 
No, I'm saying that people who are informed and know the value of the cards are well aware that the 16GB 4060 Ti is just a poor buy.
The 4070 and 4070 Super offered a serious performance upgrade with "enough" VRAM.
The 4060 had a use case as a low-powered card where 8GB was generally enough for the target audience.
The 4060 Ti 8GB was DOA from day one given the card's concerning expected lifespan.
The 4060 Ti 16GB was DOA from day one for its very poor value and the fact that it upsells you to the 4070.

In general I think less of people that have a 4060 Ti, unless it's the 16GB card and they bought it because they truly needed the 16GB.
Futureproofing I will not accept, considering the performance tier of the card.

Just seems like a lot of snobbery and it being cool to hate the 4060 series, you see a lot of that online, although the market has spoken and the cards are very popular.

Of course you can always pay more to get the next tier card up - a substantial amount more and not everyone needs the extra performance. What determines 'value' is up to each of us to decide for ourselves.
 
Just seems like a lot of snobbery and it being cool to hate the 4060 series, you see a lot of that online, although the market has spoken and the cards are very popular.

Of course you can always pay more to get the next tier card up - a substantial amount more and not everyone needs the extra performance. What determines 'value' is up to each of us to decide for ourselves.
You can ask anyone with some authority in this space and they will tell you the exact same thing.
Since you have a 4060Ti 16GB you're very biased and there's nothing I can do to sway you, so I'll leave it at that.
I'd also like to remark that I think that the 4060 is a fine card. I have a problem with the 4060Ti only.
 
Just seems like a lot of snobbery and it being cool to hate the 4060 series, you see a lot of that online, although the market has spoken and the cards are very popular.

Of course you can always pay more to get the next tier card up - a substantial amount more and not everyone needs the extra performance. What determines 'value' is up to each of us to decide for ourselves.
The 4060 is popular just because of price. Multiple cards are available around $329-$350, and some have been under $300 at times. Regardless of the fact it wasn't a great card, people looking to upgrade typically spend under $400 for a graphics card. I have been building and messing around with PCs since before I was 10 years old and took apart a 486 DX2. I have never spent over $379 for a GPU, and that was for my RX 5700 that I BIOS-flashed with a 5700 XT BIOS. That was the last GPU I purchased.
 
You can ask anyone with some authority in this space and they will tell you the exact same thing.
Since you have a 4060Ti 16GB you're very biased and there's nothing I can do to sway you, so I'll leave it at that.
I'd also like to remark that I think that the 4060 is a fine card. I have a problem with the 4060Ti only.

Tell me what though? This tier of card used to be better value? It would be better if it was faster and/or cheaper? You can get faster cards if you pay more? I know.
 
Go watch the Gamers Nexus video and look at the spec sheets. The 5070 spec sheet is more in line with what would have been a 50-class card even two or three gens ago. This isn't hyperbole; it's based on actual facts and specs.

Think the video is called something along the lines of "NVIDIA Shrinkflation."
I watched it. The premise is inherently flawed. The 4090 and 5090 are monster chips that are wholly incomparable to anything that came before and taking them as baseline is disingenuous at best. If he dropped them and used the x80 cards for Ada and Blackwell as the measurement of flagship (as he honestly should, the last two x90s are consumer cards in name only) his whole argument would fall apart and the “downtrend” would become a straight line. Don’t get it twisted, it’s the 4090 and 5090 that are the outliers, not the other cards. It’s those two that are warping everything around them.
 
It'd be the only GPU in production near the $500 price bracket. But if it's like the others, it's basically a paper launch and very few people will be able to get one. On the bright side, smaller GPUs mean more chips per wafer, which means more of the people desperate for literally any GPU will get one, reducing demand on the market a little.
Did AMD stop producing the 7800 XT yet?
That's $500 and seemingly still available.
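The more-chips-per-wafer point can be ballparked with simple area math. A rough sketch that ignores edge loss, scribe lines, and defect yield; the die sizes here are illustrative round numbers, not official figures:

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    # Crude upper bound: total wafer area divided by die area.
    # Real yields are lower (edge dies, scribe lines, defects).
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area // die_area_mm2)

# A ~180 mm^2 mid-range die vs a ~600 mm^2 flagship die on a 300 mm wafer:
print(gross_dies_per_wafer(180))  # ~392 candidate dies
print(gross_dies_per_wafer(600))  # ~117 candidate dies
```

Over three times as many candidate dies per wafer, before yield effects, which favor small dies even further.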
 
You mean $500 for an RTX 5050? $300 for the 16GB or DOA.
 
I watched it. The premise is inherently flawed. The 4090 and 5090 are monster chips that are wholly incomparable to anything that came before and taking them as baseline is disingenuous at best. If he dropped them and used the x80 cards for Ada and Blackwell as the measurement of flagship (as he honestly should, the last two x90s are consumer cards in name only) his whole argument would fall apart and the “downtrend” would become a straight line. Don’t get it twisted, it’s the 4090 and 5090 that are the outliers, not the other cards. It’s those two that are warping everything around them.
Good thing the comparison is based on the percentage you get on lower-end SKUs relative to the full uncut chip. You make it sound like Titans were never a thing before.
 
Good thing the comparison is based on the percentage you get on lower-end SKUs relative to the full uncut chip. You make it sound like Titans were never a thing before.
Not like this, no. The Titans were barely any faster and barely fuller chips than the x80 Ti's of their architectural generations, were sourced from the same chip, and those chips spec-wise are much closer to your 4080/5080-class cards. Oddities like the Titan Z (dual-chip) and Titan V (Volta never had consumer cards) notwithstanding. As I said, the 4090 and 5090 are unprecedented in what they are. Directly comparing them to anything before is a fool's errand. Unless, of course, we are going into the argument of how the 5090 should be the 5080, but then everything crumbles from there since there is no way it reasonably CAN be. Like, seriously, if you people can't see immediately that the logic of Steve's comparison is extremely flawed here and he's just being a clout goblin on this one for engagement, I can't help you.
 
Not like this, no. The Titans were barely any faster and barely fuller chips than the x80 Ti's of their architectural generations, were sourced from the same chip, and those chips spec-wise are much closer to your 4080/5080-class cards. Oddities like the Titan Z (dual-chip) and Titan V (Volta never had consumer cards) notwithstanding. As I said, the 4090 and 5090 are unprecedented in what they are. Directly comparing them to anything before is a fool's errand. Unless, of course, we are going into the argument of how the 5090 should be the 5080, but then everything crumbles from there since there is no way it reasonably CAN be. Like, seriously, if you people can't see immediately that the logic of Steve's comparison is extremely flawed here and he's just being a clout goblin on this one for engagement, I can't help you.
Alright, explain in what way they are unprecedented, because looking at die sizes, they're all fairly large: 2080 Ti/Titan RTX 754 mm², 4090 609 mm², 5090 750 mm². The only unprecedented thing seems to be the performance gap that NV created between the flagship and the next card down the stack.
 
Alright, explain in what way they are unprecedented, because looking at die sizes, they're all fairly large: 2080 Ti/Titan RTX 754 mm², 4090 609 mm², 5090 750 mm². The only unprecedented thing seems to be the performance gap that NV created between the flagship and the next card down the stack.
Die size is a useless metric when comparing chips on different nodes. More useful is transistor count. The last Titan had 35% more transistors in its chip than the one chip lower. The difference between GB202 and GB203 is more than DOUBLE. Do you get the picture yet? And sure, one can spin this as NV bad and them sandbagging and that the 5090 should be the 5080 and bla bla bla, but the reality is that in no universe could a chip the caliber of GB202 ever have been a reasonable choice for a sanely priced flagship card. Full stop. It's a datacenter/professional GPU moonlighting as a vaguely prosumer-ish card that can be ostensibly bought for gaming if one has more money than sense.
 
The 5060 Ti 8GB will be the next queen of gaming. When the neural texture compression feature gets enabled in games in a few months, together with 4x multi frame generation, it will be like having a $1600 4090 for $400.

Which means the market value of 4000-series cards will fall heavily, and their owners will fight back in forums saying this is not true, alongside short-sighted people, same as when the 4000 series was released with frame generation. History will repeat itself once again.
 
Die size is a useless metric when comparing chips on different nodes. More useful is transistor count. The last Titan had 35% more transistors in its chip than the one chip lower. The difference between GB202 and GB203 is more than DOUBLE. Do you get the picture yet? And sure, one can spin this as NV bad and them sandbagging and that the 5090 should be the 5080 and bla bla bla, but the reality is that in no universe could a chip the caliber of GB202 ever have been a reasonable choice for a sanely priced flagship card. Full stop. It's a datacenter/professional GPU moonlighting as a vaguely prosumer-ish card that can be ostensibly bought for gaming if one has more money than sense.
I see your point about transistor count with AD102, which had the more-than-double jump (169%). Although by the same metric, GB202's increase (20%) is way below the increases from GP102 to TU102 (57%) and from TU102 to GA102 (52%). Then again, the performance differences don't match the transistor counts either.

I don't know what this means in the grand scheme of things, and perhaps GN's chart isn't to be taken as some sort of gotcha, but it sure feels like bang for buck has gotten worse. Not to mention all these AI upscalers and frame generators that muck up our ability to get a clear picture (pun intended).
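The percentages above can be sanity-checked against the commonly published transistor counts (in billions; treat the counts as approximate and the percentages as rounded):

```python
# Commonly cited transistor counts for NVIDIA's biggest consumer dies,
# in billions (approximate figures).
big_dies = [
    ("GP102", 11.8),   # Pascal
    ("TU102", 18.6),   # Turing
    ("GA102", 28.3),   # Ampere
    ("AD102", 76.3),   # Ada
    ("GB202", 92.2),   # Blackwell
]

# Generation-over-generation increase of the flagship die:
for (prev, p), (cur, c) in zip(big_dies, big_dies[1:]):
    print(f"{prev} -> {cur}: +{c / p - 1:.0%}")

# The flagship-vs-next-chip-down gap within Blackwell (GB203 is ~45.6B):
print(f"GB202 / GB203: {92.2 / 45.6:.2f}x")  # just over 2x
```

The generational jumps come out around +58%, +52%, +170%, and +21%, matching the figures in the post (within rounding), and GB202 really does carry roughly double GB203's transistor count.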
 
I watched it. The premise is inherently flawed. The 4090 and 5090 are monster chips that are wholly incomparable to anything that came before and taking them as baseline is disingenuous at best. If he dropped them and used the x80 cards for Ada and Blackwell as the measurement of flagship (as he honestly should, the last two x90s are consumer cards in name only) his whole argument would fall apart and the “downtrend” would become a straight line. Don’t get it twisted, it’s the 4090 and 5090 that are the outliers, not the other cards. It’s those two that are warping everything around them.
You can throw the 90-series cards out of the conversation; that part of the argument is irrelevant. It's the actual CUDA cores and whatnot. They line up with the 50-class cards of two or three generations ago. Over the last two gens they have moved the whole product stack down a segment each gen.
 
I see your point about transistor count with the AD102 which had the more than double jump (169%). Although by the same metric, GB202 is way below (20%) the increase between GP102 to TU102 (57%) to GA102 (52%). Then again, the performance differences don't match with transistor counts either.
Yeah, Blackwell's low increase is fairly easy to understand: no new node. AD102 was already at the limit, and the yields were already horrible. The moment it became known that Blackwell would stay on the same node as Ada (albeit an optimized version), pretty much everyone who understands anything saw the writing on the wall: the increase wouldn't be significant. Most of the 5090's performance advantage over the 4090 comes from architectural improvements and a raw increase in power.
You are right about diminishing performance per dollar though, never argued against that.

You can throw the 90-series cards out of the conversation; that part of the argument is irrelevant. It's the actual CUDA cores and whatnot. They line up with the 50-class cards of two or three generations ago. Over the last two gens they have moved the whole product stack down a segment each gen.
This is hilariously wrong on both counts. First, you cannot throw the x90 out of the conversation when Steve takes it as the baseline. Secondly, CUDA core counts are incomparable between pre- and post-Ampere. That was when NV overhauled the architecture entirely and split the core in two, doubling the numerical count. 2560 Turing cores are not even remotely the same as 2560 Ampere/Ada/Blackwell cores. Another reason, by the way, why the GN video is clout-goblin bogus. Worse yet, he KNOWS the architecture changed, yet he still makes the point.
 
What a fkng joke!
The 5060 Ti is a beefed-up 4060 Ti.

I paid $484.86 with tax in a Best Buy walk-in purchase on January 25th, 2024, so $442 was the net price.

$500+ is pushing it for 1080p though. I wasn't keen on the price I paid for the 4060 Ti.

The 4060 Ti is two years newer than an RX 6700 XT and is 30% faster. The 6700 XT released in 2021 at $480.

So maybe 500 bucks for a 5060 Ti isn't really a joke. It'll land between a 4060 Ti and a straight 4070. The 4070 is a $600++ card...
 