
NVIDIA GeForce RTX 5060 Ti 16 GB SKU Likely Launching at $499, According to Supply Chain Leak

Probably speculative at this point, but this MSRP is likely driven by the current 50% tariff situation on goods coming out of China.

The average consumer gave in to paying more in taxes than the highest income bracket does, on something as mundane as a midrange GPU.
 
Yeah, Blackwell's low uplift is fairly easy to understand: no new node. AD102 was already at the limit, and the yields were already horrible. The moment it became known that Blackwell would stay on the same node as Ada (albeit optimized), pretty much everyone who understands anything saw the writing on the wall: the increase wouldn't be significant. Most of the 5090's performance gain over the 4090 comes from architectural improvements and a raw increase in power.
You are right about diminishing performance per dollar though, never argued against that.


This is hilariously wrong on both counts. First, you cannot throw the x90 out of the conversation when Steve takes it as a baseline. Secondly, CUDA core counts are incomparable between pre- and post-Ampere. That was when NV overhauled the architecture entirely and split the core in two, doubling the numerical count. 2560 Turing cores are not even remotely the same as 2560 Ampere/Ada/Blackwell cores. Another reason, by the way, why the GN video is clout-goblin bogus. Worse yet, he KNOWS the architecture changed, yet he still makes the point.
Can't compare to the 4090 and 5090 (but not the 3090 for some reason)? Okay.
Can't compare between pre- and post-Ampere CUDA cores? Okay.
Can't compare using die size? Okay...
Can't throw the x90 out of the conversation because Steve takes it as a baseline?
...
What point are you trying to make here, man? That Steve's a "clout goblin" or something? Why don't you tell us what we CAN compare between the past ten years of GPU generations, then?

So maybe 500 bucks for a 5060 Ti isn't really a joke. It'll be between a 4060 Ti and a straight 4070. The 4070 is a $600+ card...
It better be a joke, NVIDIA already tried it with the 4060 Ti at that price and had to discount it. The RX 7800 XT released for $500 already and it doesn't look like the 5060 Ti will beat that card at the moment...
 
That's honestly too much when the 5070 has a $549 MSRP.
If it were $449 max, then with the extra 4 GB of VRAM it would be somewhat acceptable.
This card is most likely going to be slower than a 2020-era 3080. Crazy how slow the mid-range progression has been. If you had told me in 2020 that not only would a future 5060 Ti be 500 dollars, but it would also be slower than a 3080, I'd tell you you're smoking crack. Now it's just the new normal :(
 
Did AMD stop producing the 7800xt yet?
That's $500 and seemingly available still.
I assumed production had stopped. I'm not sure. It's "Temporarily Out of Stock" and "Back-Ordered" on B&H, "Out of Stock" at Newegg, and I can't find it at Amazon. But I also can't find any statement that production had stopped.
 
It better be a joke, NVIDIA already tried it with the 4060 Ti at that price and had to discount it. The RX 7800 XT released for $500 already and it doesn't look like the 5060 Ti will beat that card at the moment...
MSRP and what it's sold for are two completely different things.

The 7800 XT is in the $600+ range right now.

The 9070 XT is $1,000+.

5060 Ti (MSRP): $499.

Prices = :kookoo:
 
MSRP and what it's sold for are two completely different things.

The 7800 XT is in the $600+ range right now.

The 9070 XT is $1,000+.

5060 Ti (MSRP): $499.

Prices = :kookoo:
I suppose if you're a goldfish a $500 MSRP would make sense. Would be funny for NVIDIA to do the AMD method again. Definitely not gonna buy it though.
 
Just saw this video show up on the YouTube home page:

I'm so glad that people listen to GN saying things they wouldn't listen to me saying :laugh:

Maybe I'm wrong; maybe paying $400-$500 for something that was $150-$250 8-10 years ago is in fact normal. Because of inflation, for example. I'm sure 2-3% inflation over 8-10 years amounts to a 100% price increase.

If the government claims a 3% inflation rate, real inflation is probably on the 15 to 20% scale. Sometimes more. And when it comes to luxury goods... definitely more.
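For what it's worth, the compounding is easy to check. A quick Python sketch (the rates and time spans are just the ones people throw around in this thread, not official CPI data):

```python
# Sanity check: what does steady 2-3% annual inflation compound to
# over 8-10 years? (Illustrative rates, not official CPI figures.)
def compounded_increase(rate: float, years: int) -> float:
    """Total price increase in percent after `years` of compounding."""
    return ((1 + rate) ** years - 1) * 100

for rate in (0.02, 0.03):
    for years in (8, 10):
        print(f"{rate:.0%} over {years} years -> +{compounded_increase(rate, years):.1f}%")
```

Even the high end (3% for 10 years) works out to roughly +34%, and at 3% a year it takes about 23 years for prices to double, so headline inflation alone doesn't get you anywhere near a 100% jump.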
 
I suppose if you're a goldfish a $500 MSRP would make sense. Would be funny for NVIDIA to do the AMD method again. Definitely not gonna buy it though.
None of the GPU pricing makes sense. But people are buying, or the 5000 series wouldn't be out of stock for weeks and weeks. It's got AI capabilities, bro. AI tax season is here! The timing is so good. Wink wink.
 
I see your point about transistor count with AD102, which made a more-than-double jump (169%). Although by the same metric, GB202's increase (20%) is way below the jumps from GP102 to TU102 (57%) and from TU102 to GA102 (52%). Then again, the performance differences don't match the transistor counts either.

I don't know what this means in the grand scheme of things, and perhaps GN's chart isn't to be taken as some sort of gotcha, but it sure feels like bang for buck has gotten worse. Not to mention all these AI upscalers and frame generators that muck up the ability to get a clear picture (pun intended).
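Those gen-over-gen percentages are easy to reproduce. A quick sketch using published transistor counts in billions (figures pulled from public spec databases, so treat them as approximate):

```python
# Transistor counts (billions) for the big dies, per public spec
# listings. These are assumed figures; treat them as approximate.
dies = [
    ("GP102", 11.8),
    ("TU102", 18.6),
    ("GA102", 28.3),
    ("AD102", 76.3),
    ("GB202", 92.2),
]

# Percentage increase of each die over its predecessor.
for (prev_name, prev), (name, count) in zip(dies, dies[1:]):
    increase = (count / prev - 1) * 100
    print(f"{prev_name} -> {name}: +{increase:.0f}%")
```

That lands right around the 57/52/169/20% figures quoted above, give or take rounding.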

GP102 and TU102 shouldn't even be compared, given that Turing was the first RTX lineup, including RT and Tensor cores, which Pascal and previous generations didn't have.

And yes, die size doesn't mean much nowadays. The 5090 has a die almost twice the size of the 5080's but is nowhere near 2x as powerful! Usually it's around 50% faster, and sometimes (exceptionally) up to 75% faster at 4K with heavy RT or PT.
 
None of the GPU pricing makes sense. But people are buying, or the 5000 series wouldn't be out of stock for weeks and weeks. It's got AI capabilities, bro. AI tax season is here! The timing is so good. Wink wink.
Real retail pricing of cards doesn't make sense; MSRP is supposed to. If a $500 MSRP for a 5060 Ti is a good idea, then a $1,000 MSRP for a 9070 XT is a better one, because that's what you're saying people are buying it for. Obviously this isn't the case.
 
If AMD plays this right with the RX 9060, they're gonna take the entire mid segment, just like they did the upper-mid to high end with the RX 9070. Those are selling like hot cakes.

NVIDIA is just delusional at this point, and I hope it blows up in their face. Why not just make the bottom models cost $999 at this point? It's all ridiculous.
 
Why do AMD fanboys happily buy overpriced AMgreeD stuff but complain about high prices in Nvidia threads?
 
The 4060 series was the dud (see below)...

Late-night math: about 43% of Steam users run a GPU slower than a 4060 Ti.

So the 5060 Ti will be a substantial performance upgrade for at least 1 out of 3 gamers.


[Attachment: 4060ti_dud.png]
 
Just saw this video show up on the YouTube home page:

Anyone who thinks a 5070 is a "5050" is just dumb.

Die size reference since the GTX 600 series:

GTX 670: GK104 @ 294 mm². 1 SMX disabled // GTX 680 was full-die GK104
GTX 770: GK104 @ 294 mm². 1 SMX disabled // GTX 780 was cut-down GK110 (561 mm²), -3 SMX
GTX 970: GM204 @ 398 mm². 3 SMM disabled // GTX 980 was full-die GM204
GTX 1070: GP104 @ 314 mm². 5 SM disabled // GTX 1080 was full-die GP104
*RTX 2070: TU106 @ 445 mm². 0 SM disabled // RTX 2080 was cut-down TU104 (545 mm²), -2 SM*
RTX 3070: GA104 @ 392 mm². 2 SM disabled // RTX 3080 was cut-down GA102 (628 mm²), -16 SM
RTX 4070: AD104 @ 294 mm². 14 SM disabled // RTX 4080 was cut-down AD103 (379 mm²), -4 SM
-
RTX 5070: GB205 @ 263 mm². 2 SM disabled // RTX 5080 is full-die GB203 (378 mm²)
RTX 5070 Ti: GB203 @ 378 mm². 14 SM disabled // RTX 5080 is full-die GB203

Most 70/80 cards historically fell into the 300-400 mm² range, which would make the 5070 Ti more of a true "5070".

The only outlier and counterargument is the 2070's full-die TU106, but the whole TSMC 12 nm run was overly large in general.

The 2080 Ti's TU102 is slightly bigger than the 5090's GB202, but that's no argument. The mid/lower-end die used for the 2060/2070 was 445 mm² on that same dated TSMC 12 nm node. Much different situation.

The current flagship die is a lot bigger relative to legacy generations, especially since it's on TSMC 4N. Has the gap from the absolute flagship gotten progressively worse? Yes, but Steve and everyone else are acting like the 5070 is a 150 mm² die or some shit.

The 5070 closely resembles the same config as the RTX 3060. Both are in the 260-270 mm² range with 2 SM disabled and a 192-bit bus. There are obviously way more SMs due to density capability (28 vs 48), but that's beside the point.

300-400 mm² was always the "performance/mid die" segment. The x60 series historically hovered in the 200 mm² range until RTX 20. I guess NVIDIA wanted to go back to that, since x60 on RTX 40/50 will both be sub-200 mm²... lol

The way the chart is presented is so freaking skewed, especially because GB202 is ABNORMAL. Most flagship designs peaked around 500-600 mm² since the late 2000s.

Eff nvidia, but this is just brain rot.
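To put the "gap from the flagship" point in numbers, here's a rough sketch comparing each 70-class die to its generation's biggest gaming die. The 70-class sizes are from the list above; the flagship sizes (GM200, GP102, TU102, AD102, GB202, etc.) are my own figures from public spec listings, so treat them as approximate:

```python
# 70-class die area vs. the generation's flagship die area (mm²).
# 70-class sizes are from the list above; flagship sizes are assumed
# from public spec listings and approximate.
gens = [
    ("GTX 670",  "GK104", 294, "GK110", 561),
    ("GTX 970",  "GM204", 398, "GM200", 601),
    ("GTX 1070", "GP104", 314, "GP102", 471),
    ("RTX 2070", "TU106", 445, "TU102", 754),
    ("RTX 3070", "GA104", 392, "GA102", 628),
    ("RTX 4070", "AD104", 294, "AD102", 609),
    ("RTX 5070", "GB205", 263, "GB202", 750),
]

for card, die, area, flag_die, flag_area in gens:
    print(f"{card} ({die}): {area / flag_area:.0%} of {flag_die}")
```

By these figures the 5070's die is only about a third of GB202's area, versus roughly half to two-thirds in older generations, though GB202 being abnormally large skews that last ratio, as noted above.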
 
If the government claims a 3% inflation rate, real inflation is probably on the 15 to 20% scale. Sometimes more. And when it comes to luxury goods... definitely more.
Ah... nope! 15-20% is what you get with monopolies, not because the bad government is lying to you. Have a look at TVs, where there are so many companies competing.
In 10 years' time, we're buying higher refresh rates, higher resolutions, and double-the-size TVs with the same money. If there were a monopoly in TVs, we would be buying the same inches with some extra features at probably +50% of the old prices. It's not inflation; it's the monopoly in GPUs driving prices up.

The prices are what they are, people aren't stupid for paying what they have to pay. Are people stupid for paying 2-3x as much for a used car nowadays as they did five years ago? The world changed since 2020 and hasn't changed back yet. In fact with the tariffs at least in the US the current prices could soon be a pleasant memory.
The prices are what they are because people pay whatever a certain company asks. And with the last series, they even paid much higher than MSRP. You know what that company thinks today? "Well, we can price the next series even higher. People are willing to buy anyway." So yes, considering GPUs, people are stupid to celebrate a $750 MSRP and then go happily pay $1,000. I mean, either buying a GPU is a matter of life or death, or people are stupid. As for cars, if your numbers are accurate (and I do doubt that they are), I guess Brexit played a role in the UK specifically, and COVID, which drove people to abandon mass transportation and buy cars for safety, also created higher demand. Also, a car can be a significant investment for most people, maybe even a way to get or keep their jobs. Paying 50% over MSRP to move the quality setting from High to Ultra is stupid, period.

No shortage of pretentious douchery here, ha. Customers are stuck paying a premium if they want the good stuff, Einstein; they don't really have a choice.

Jesus...
Why are customers stuck paying a premium? Is a GPU vital for their survival? Is someone pointing a gun at them? Is buying a new GPU today at 50% over MSRP a reason to work and live for? Well, Einstein? Which is it?

People whining about GPU pricing one day, then the next day justifying, or even aggressively defending, those prices, and also excusing everyone willing to pay them, is somewhat contradictory, don't you think?
 
This turd should cost at most $350. Considering it only has 5% more cores, it will probably only be 5-10% faster than the 4060 Ti 16 GB. So 5-10% faster for the same price? Nvidia, and more importantly their consumers, are utter idiots with no brains.
 
This turd should cost at most $350. Considering it only has 5% more cores, it will probably only be 5-10% faster than the 4060 Ti 16 GB. So 5-10% faster for the same price? Nvidia, and more importantly their consumers, are utter idiots with no brains.
Yes, but you know, MULTI Frame Generation and stuff, or "as fast as an RTX 4080," as JH would probably say.
 
Ah... nope! 15-20% is what you get with monopolies, not because the bad government is lying to you.

Could have fooled me, if not for that grocery run yesterday :shadedshu:

I was specifically talking about inflation itself as a whole; I'm well aware of the pressure from monopolies, trends, etc., though.
 
Are you saying gaming performance is never better on the 16GB card vs the 8GB? Because there are multiple cases where there is a big difference in minimums and/or averages. Not everyone who bought the 16GB is just stupid, some decided it was worth paying a bit extra to avoid the cases where an 8GB card came up short (yes even at 1080p). It costs a bit more but you can sell or trade it in for more when you come to upgrade. It just seems to have become an accepted truth that the 16GB card is just a con, which isn't remotely true.

Hopefully we'll see more games used in GPU reviews on this site which use over 8GB, and show the minimums too, so we can get the full picture. TLOU1 was one such game where the 16GB card was notably quicker than the 8GB even at 1080p, I expect TLOU2 may be the same.
The downside of the incoming 5060 Ti, as well as the 16 GB 4060 Ti, and even the RX 7600 XT, isn't that they are 128-bit, 8-lane PCIe cards that *might* not be able to effectively leverage their 16 GB of VRAM, but rather, simply their price. It seems that some people view the VRAM amount as somehow being a negative for a reason other than the increased cost, as if these cards are 'unbalanced' in some way by that.

As I see it, it isn't a downside that they have 16 GB, but rather a benefit that they have more than 8. They're just too expensive.
 
Pathetic. Just get a used 3080 or something, guys.
 
The prices are what they are because people pay whatever a certain company asks. And with the last series, they even paid much higher than MSRP. You know what that company thinks today? "Well, we can price the next series even higher. People are willing to buy anyway." So yes, considering GPUs, people are stupid to celebrate a $750 MSRP and then go happily pay $1,000. I mean, either buying a GPU is a matter of life or death, or people are stupid. As for cars, if your numbers are accurate (and I do doubt that they are), I guess Brexit played a role in the UK specifically, and COVID, which drove people to abandon mass transportation and buy cars for safety, also created higher demand. Also, a car can be a significant investment for most people, maybe even a way to get or keep their jobs. Paying 50% over MSRP to move the quality setting from High to Ultra is stupid, period.

Nothing to do with Brexit, everything to do with COVID and its knock-on effects; it has affected the whole world. I thought it was common knowledge. You can find many articles from the US and elsewhere about skyrocketing used-car prices. Also, it is more common to see much older cars on the road nowadays than it used to be. If you want or need something, you either pay the price or do without. Shaming people for paying what they have to doesn't make much sense. I just used cars as an example, but you could apply the same principle to anything.
 
Over the next 5 years we will see how nGreedia makes an xx50-class card for $1,000... Oh wait... That's the 5080.
 
The downside of the incoming 5060 Ti, as well as the 16 GB 4060 Ti, and even the RX 7600 XT, isn't that they are 128-bit, 8-lane PCIe cards that *might* not be able to effectively leverage their 16 GB of VRAM, but rather, simply their price. It seems that some people view the VRAM amount as somehow being a negative for a reason other than the increased cost, as if these cards are 'unbalanced' in some way by that.

As I see it, it isn't a downside that they have 16 GB, but rather a benefit that they have more than 8. They're just too expensive.

Doesn't the large L2 cache significantly mitigate the lack of headline bandwidth on the 4060 cards, especially the 16 GB? I dunno, I think people get too hung up on paper specs rather than looking at how it actually performs. The bandwidth is poor on paper but in the real world doesn't seem to hold it back much. The 5060 cards have much higher bandwidth, but I'm guessing this won't actually translate into significantly better performance.
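It can, on paper at least. A crude back-of-envelope model (my simplification, not an NVIDIA figure): if some fraction of memory requests hits the L2, only the misses touch DRAM, so the card behaves as if its DRAM bandwidth were raw / (1 − hit rate):

```python
# Crude model: with L2 hit rate h, only (1 - h) of requests reach
# DRAM, so effective bandwidth ~ raw / (1 - h). The hit rates below
# are illustrative guesses, not measured values.
def effective_bandwidth(raw_gbs: float, hit_rate: float) -> float:
    return raw_gbs / (1 - hit_rate)

RAW_4060_TI = 288  # GB/s: 128-bit bus, 18 Gbps GDDR6
for h in (0.3, 0.5):
    print(f"hit rate {h:.0%}: ~{effective_bandwidth(RAW_4060_TI, h):.0f} GB/s effective")
```

So a modest hit rate already stretches a weak 288 GB/s bus a long way, which would explain why the narrow bus doesn't hurt as much as the paper spec suggests.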
 
The 5060 Ti 8 GB will be the next queen of gaming. When the neural texture compression feature gets enabled in games in a few months, together with multi frame generation 4x, it will be like having a $1,600 4090 for $400.

Which means the 4000 series cards' market value will fall heavily, and their owners will fight back in forums saying this is not true, alongside people with short-sighted vision, same as when the 4000 series was released with frame generation. History will repeat itself once again.
You can't use multi frame generation 4x with sub-60 base framerates (well, technically you can, but it's crap; not even Nvidia recommends it). It cannot be pressed into the service you're hoping for: making entry-level GPUs adequately run highly demanding titles.
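The arithmetic behind that, sketched quickly (illustrative numbers, not an official latency model): frame generation multiplies *displayed* frames, but the game still simulates and samples input once per real frame, so responsiveness tracks the base framerate:

```python
# MFG 4x multiplies displayed frames; input is still sampled once
# per real frame, so the interval between real frames sets the
# responsiveness floor. Illustrative numbers only.
def mfg(base_fps: float, factor: int = 4) -> tuple[float, float]:
    displayed_fps = base_fps * factor
    real_frame_interval_ms = 1000 / base_fps
    return displayed_fps, real_frame_interval_ms

for base in (30, 60):
    fps, ms = mfg(base)
    print(f"{base} fps base -> {fps} fps displayed, {ms:.1f} ms between real frames")
```

A 30 fps base shows a 120 fps counter but still feels like 33 ms per real frame, which is why sub-60 base framerates feel bad no matter what the displayed number says.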
 