
NVIDIA GeForce RTX 5060 Ti 16 GB SKU Likely Launching at $499, According to Supply Chain Leak

Nvidia are selling some heavily cut-down dies (aka lower-tier GPUs) compared to the GTX generations, but selling them for a lot more money than they should be... That's a fact. The 5080 having a GB203 chip with a 256-bit bus is an absolute joke for a GPU sold for $1,000. The same goes for lower models. We definitely need more competition! We all love Nvidia GPUs, but Nvidia is all about AI and making huge margins; they don't care about Gamers anymore.
Nvidia is selling what the market is buying. Full stop. When the entire top 10 on the Steam HW chart is just NV GPUs, you know the market has spoken. We'll see if RDNA 4 will ameliorate the situation a bit, but probably not. Oh, and NV hasn't cared about the "Gamers" in a long time. The writing was on the wall 15 years ago. I have no idea why some people are coming to this realization just now.

I really hope AMD & Intel will gain some really good market share in the near future! I cannot wait to see that AI bubble burst and see how Nvidia rushes back to gamers to save their ass...
I too enjoy making up fantastical scenarios that have almost no reasonable chance to come to pass. Even if the AI bubble bursts, NV had massive HPC contracts before it even started. They aren't going to need the gaming market to "save" them since they ALREADY own that market. They have around 90% of it. It's fully tapped. If, for whatever reason, AMD ever gets going in a way NV actually would care to address, all they would need to do is make a relatively small price cut and back to irrelevance Radeon goes. Intel is completely pointless to even discuss since I have doubts the GPU division will even survive the next 5 years.
 
Nvidia are selling some heavily cut-down dies (aka lower-tier GPUs) compared to the GTX generations, but selling them for a lot more money than they should be... That's a fact.
No, it’s entirely your opinion.
 
Like... I honestly get it. I don't like the super cut-down HW either. But if they were only selling full dies, prices would be even worse :banghead:
 
Nvidia are selling some heavily cut-down dies (aka lower-tier GPUs) compared to the GTX generations, but selling them for a lot more money than they should be... That's a fact. The 5080 having a GB203 chip with a 256-bit bus is an absolute joke for a GPU sold for $1,000. The same goes for lower models. We definitely need more competition! We all love Nvidia GPUs, but Nvidia is all about AI and making huge margins; they don't care about Gamers anymore.

I really hope AMD & Intel will gain some really good market share in the near future! I cannot wait to see that AI bubble burst and see how Nvidia rushes back to gamers to save their ass...

Not really, at least percentage-wise relative to previous GTX generations. I already listed the die size in mm² of every 70/80-series card since the GTX 600 series on a previous page.

The bigger concern to have? Lower-end GPUs like the x60/x70 tiers used to give the consumer significantly more than what is currently being offered, and the generational gains aren't as big outside of making the 5090 a monster 750 mm² behemoth... and the 5090 isn't even fully enabled either.

Every other GPU below it looks much worse when counting CUDA cores relative to what GB202 is.

The GTX 1080 @ $599 MSRP, adjusted for inflation, is basically $800, and that's only a 314 mm² die (fully enabled) on a much, much cheaper TSMC 16 nm node. No point mentioning the GTX 600-900 series since they're all 28 nm... which was even cheaper per wafer, lol.
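If anyone wants to sanity-check that figure, here's a minimal Python sketch; the ~33% cumulative US CPI multiplier from 2016 to now is my own rough assumption, not an official number:

# Rough sanity check of the inflation-adjusted GTX 1080 price quoted above.
# The 1.33 multiplier (~33% cumulative US inflation since 2016) is an assumption.
msrp_2016 = 599
cpi_multiplier = 1.33
print(f"GTX 1080 $599 (2016) is roughly ${msrp_2016 * cpi_multiplier:.0f} today")  # ~$797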

Pricing arguably got wonky with the RTX 20 series. Overly large 12 nm dies. Even the RTX 2060 was a cut-down 445 mm² TU106.

The RTX 30 series allowed a more traditional "28nm-type" kind of run on Samsung 8 nm. The 3070 (GA104) went back to the high side, closer to 400 mm² (2 SM disabled). The 3080 (GA102) deleted 16 SM from the flagship die @ 628 mm², so about 80% enabled... same as the GTX 780 with its 12 of 15 SMs.

The 5080 is actually on the bigger side of x80 releases @ 378 mm² on a current N4 node (it's the full die with 84 SM).

Edit:

Overpriced? Might be debatable... at least at MSRP. The market price does suck though, especially for AIB cards. (Rough price-per-mm² numbers in the sketch after the list below.)

680? 300 mm², full die. $500 in 2012.
780? 550 mm², cut-down flagship die, but a price increase from $500 to $650 (obviously) in 2013.
980? 400 mm², full die, last 28 nm release and yields were sky-high. $550 in 2014.
1080? 300 mm², full die. $600 in 2016.
2080? 550 mm², cut down. $700 in 2018.
3080? 630 mm², cut down from the flagship die. $700 in late 2020, Samsung 8 nm (which was probably due to cost reasons and lack of TSMC availability).
4080? 380 mm², cut down, and a $1200 launch... (Okay, I'll admit, this one WAS overpriced.)
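Just to make that list concrete, here's a quick Python sketch of nominal launch price per mm² using the rounded figures above (no inflation adjustment, and it obviously ignores how much more a wafer costs on newer nodes, which is the point about 28 nm vs 16 nm vs N4):

# Nominal launch price per mm² of die area, using the rounded numbers listed above.
cards = {
    "GTX 680 (2012)":  (300, 500),
    "GTX 780 (2013)":  (550, 650),
    "GTX 980 (2014)":  (400, 550),
    "GTX 1080 (2016)": (300, 600),
    "RTX 2080 (2018)": (550, 700),
    "RTX 3080 (2020)": (630, 700),
    "RTX 4080 (2022)": (380, 1200),
}
for name, (die_mm2, usd) in cards.items():
    print(f"{name}: {usd / die_mm2:.2f} USD per mm²")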

The truth is... NVIDIA was always playing these games. The general performance gap per generation just isn't as big anymore. The GTX 680 > 980 improvements were literally INSANE on the same 28 nm node.

My biggest gripe now is that they're obviously trying to manipulate the memory controller configs because 3/4 GB memory modules are right around the corner. Don't be surprised to see 18 GB 5070s show up to compete against the 9070/XT's current dominance.
 
Nvidia is selling what the market is buying. Full stop. When the entire top 10 on the Steam HW chart is just NV GPUs, you know the market has spoken. We'll see if RDNA 4 will ameliorate the situation a bit, but probably not. Oh, and NV hasn't cared about the "Gamers" in a long time. The writing was on the wall 15 years ago. I have no idea why some people are coming to this realization just now.


I too enjoy making up fantastical scenarios that have almost no reasonable chance to come to pass. Even if the AI bubble bursts, NV had massive HPC contracts before it even started. They aren't going to need the gaming market to "save" them since they ALREADY own that market. They have around 90% of it. It's fully tapped. If, for whatever reason, AMD ever gets going in a way NV actually would care to address, all they would need to do is make a relatively small price cut and back to irrelevance Radeon goes. Intel is completely pointless to even discuss since I have doubts the GPU division will even survive the next 5 years.

A lot of that is due to AMD having bad drivers in the past and also making bad decisions... Also, Nvidia used to make much more gaming- and performance-oriented GPUs back then. But we need more competition now, otherwise gaming as we know it is going to die. Most of those companies just want gaming to move to streaming, just like Netflix; look at Xbox Game Pass and PS Now, it's just the beginning until we all have to pay a subscription... Let's pray Valve & Steam will stay the way they are.

Nvidia is dominating AI now, but that doesn't mean they still will in the future. I'm sure more and more companies will compete with them and that Nvidia will be beaten sooner or later. Who, back in 2015, thought TSMC would be in its current position? Nobody. Same as AMD beating Intel after a whole decade of complete domination...

No, it’s entirely your opinion.
My opinion? Lol...
(two attached graphs)


Not really, at least percentage-wise relative to previous GTX generations. I already listed the die size in mm² of every 70/80-series card since the GTX 600 series on a previous page.

The bigger concern to have? Lower-end GPUs like the x60/x70 tiers used to give the consumer significantly more than what is currently being offered, and the generational gains aren't as big outside of making the 5090 a monster 750 mm² behemoth... and the 5090 isn't even fully enabled either.

Every other GPU below it looks much worse when counting CUDA cores relative to what GB202 is.

The GTX 1080 @ $599 MSRP, adjusted for inflation, is basically $800, and that's only a 314 mm² die (fully enabled) on a much, much cheaper TSMC 16 nm node. No point mentioning the GTX 600-900 series since they're all 28 nm... which was even cheaper per wafer, lol.

Pricing arguably got wonky with the RTX 20 series. Overly large 12 nm dies. Even the RTX 2060 was a cut-down 445 mm² TU106.

The RTX 30 series allowed a more traditional "28nm-type" kind of run on Samsung 8 nm. The 3070 (GA104) went back to the high side, closer to 400 mm² (2 SM disabled). The 3080 (GA102) deleted 16 SM from the flagship die @ 628 mm², so about 80% enabled... same as the GTX 780 with its 12 of 15 SMs.

The 5080 is actually on the bigger side of x80 releases @ 378 mm² on a current N4 node (it's the full die with 84 SM).

Edit:

Overpriced? Might be debatable... at least at MSRP. The market price does suck though, especially for AIB cards.

680? 300 mm², full die. $500 in 2012.
780? 550 mm², cut-down flagship die, but a price increase from $500 to $650 (obviously) in 2013.
980? 400 mm², full die, last 28 nm release and yields were sky-high. $550 in 2014.
1080? 300 mm², full die. $600 in 2016.
2080? 550 mm², cut down. $700 in 2018.
3080? 630 mm², cut down from the flagship die. $700 in late 2020, Samsung 8 nm (which was probably due to cost reasons and lack of TSMC availability).
4080? 380 mm², cut down, and a $1200 launch... (Okay, I'll admit, this one WAS overpriced.)

The truth is... NVIDIA was always playing these games. The general performance gap per generation just isn't as big anymore. The GTX 680 > 980 improvements were literally INSANE on the same 28 nm node.

My biggest gripe now is that they're obviously trying to manipulate the memory controller configs because 3/4 GB memory modules are right around the corner. Don't be surprised to see 18 GB 5070s show up to compete against the 9070/XT's current dominance.

GPUs are a lot more expensive than they used to be when you look at the performance/price ratio at a given point in time (i.e. the year and the performance available at that time). Look at the graphs I just posted...
 
My opinion? Lol...
(two attached graphs)
This was already addressed by several people, me included. Steve is working from a flawed position here by default. The 4090 and 5090 are completely warping the entire comparison since they are, essentially, unprecedented compared to any consumer chip NV has released before in terms of size and resources. They make everything else look more "cut down" even if it isn't.

But even if we pretend that the GN argument has merit (it does in a vacuum, just not in the way they chose to present it) - so what? People are still buying the cards, aren't they? I can assure you, only the enthusiast minority on forums like these even cares about the chips in the cards and how much they are or are not cut down. The layperson just knows that he has an "insert budget here" budget and that there is an NV GPU being sold for roughly that amount of cash. He also "knows" that NV makes the only video cards worth getting. He "knows" that Radeon is just a worse NV with bad drivers that mostly weirdos buy, and he probably has never even heard that Intel makes GPUs, since Intel has been about as bad as Radeon at the whole "capture mind-share" thing. So he goes out (or logs onto the online store of choice) and buys the 4060 or the 4060 Ti or the 4070. And no, he doesn't care that psychotic enthusiasts have declared those cards unusable dogshit with a bad amount of VRAM, pathetic memory buses and underwhelming chips. The layperson, after all, just wants to play games, and the card seemingly does that just fine. So he is content, and the next time he will, once again, buy an NV card. And this is the reason, kids, why seething about "NV bad" and constructing elaborate arguments explaining WHY is completely pointless, a waste of time, and will not turn the dial on the market at all. Thanks for coming to my TED Talk.
 
It never got that cheap here; that's £187. £250 maybe, but £280-£300 was more typical, same as the RTX 4060. Prices haven't moved much for years at this tier, in real terms. I am struggling to be outraged.

This tier of GPU gives you performance similar to a PS5, and you can sell it after a few years and get most of your money back if you want.
It was selling for around 250 euros here, and there were occasions during sales periods when it dropped to 220 euros. I would have had that card in my specs if I hadn't bought the RX 6600 for about 180 euros, 15 days before I saw that sale. But it's a fact that Nvidia was offering a really good card with plenty of VRAM, until recently. Today they offer somewhat better performance with only 8 GB of VRAM at a higher price point, meaning anyone wanting a sub-300-euro Nvidia card that will last can't find one.
AMD did the same. They had great options with the RX 6000 series in the form of the 6700 and 6750, cards that could be found and bought at around 280-350 euros a year or two ago. AMD also hasn't replaced those cards with models of equivalent performance or VRAM. The only option is the 7600 with 8 GB of VRAM and performance like a 6600 XT, and, over 300 euros, the 16 GB model with more VRAM than the 6700 and 6750, but less performance.
The only one who can save that part of the market is Intel, but they really need to fix their manufacturing. Only then will they have the means to flood the market with sub-$300 cards offering good performance and enough VRAM, while still enjoying good profit margins on those cards. Relying on TSMC is something that probably limits them.

And now we are in a new phase of the market, where prices are artificially inflated much higher than MSRP. So, why would the consumer defend those practices? Because of "wants" that are translated as "needs".
"The X guy who laughs at me on the Internet just bought a new RTX 5070/RX 9070 at MSRP because he was lucky. I want, no, need to buy an RTX 5070 Ti/RX 9070 XT at any price to beat him".
or
"That game that I rarely play needs more GPU power to put that one last setting at Ultra. I want, no, need to switch that setting to Ultra, or I can't enjoy the game"
or
"I am only able to produce 1 image per minute of nude elf girls with this GPU. I want, no, need to buy a faster one to produce more nude elf girl images"
or
"This GPU is slow in transcoding and Blender. I want, no, need to buy a faster one in case I decide to transcode something, or make an animated video".
 
"I am only able to produce 1 image per minute of nude elf girls with this GPU. I want need to buy a faster one to produce more nude elf girl images"
or
"This GPU is slow in transcoding and Blender. I want need to buy a faster one in case I decide to transcode something, or make an animated video".
These two can actually be unironic needs if that's how one makes his daily bread. Yes, the first one included. No judgement from me; one does what he has to do to get the bag.
 
I hate to break it to you, but a lot of people can; spending $1500 on a video card is not so difficult compared to $1000.

The price increase is called the law of opportunity cost. You want the best? You're stuck with the price tag. And your argument is "I feel stupid". I'm not sure it's just a feeling.
I could say many things about your arguments, even this so-called argument that you posted here. But I think I will really feel stupid if I continue this conversation with someone like you.
 
And now we are in a new phase of the market, where prices are artificially inflated much higher than MSRP. So, why would the consumer defend those practices? Because of "wants" that are translated as "needs".
"The X guy who laughs at me on the Internet just bought a new RTX 5070/RX 9070 at MSRP because he was lucky. I want, no, need to buy an RTX 5070 Ti/RX 9070 XT at any price to beat him".
or
"That game that I rarely play needs more GPU power to put that one last setting at Ultra. I want, no, need to switch that setting to Ultra, or I can't enjoy the game"
or
"I am only able to produce 1 image per minute of nude elf girls with this GPU. I want, no, need to buy a faster one to produce more nude elf girl images"
or
"This GPU is slow in transcoding and Blender. I want, no, need to buy a faster one in case I decide to transcode something, or make an animated video".

None of this applies to the subject of the thread, the 60 series, which does sell at or near MSRP. Other than that, it seems you've invented a lot of straw men to get angry about.

FWIW I do use nvenc to encode videos regularly when I need speed; otherwise I'll do it on the CPU if I have time and need/want the quality. Another benefit of Nvidia cards specifically over the competition (unless the others improved and I never heard about it) is the way you can easily record full-quality gameplay in real time using their own software, with no performance penalty (or a negligible one).

I imagine most people buying 60-series cards game at 1080p (like me) or 1440p and have no illusions about running at max details; if we can, then it's a bonus, and I take it as a given I'll have to turn some things down, usually an insignificant sacrifice. You can't go much lower than the 60 series if you want a very good gaming experience, so shaming people for buying this particular tier of card makes no sense.

The poster above is right too, nobody normal cares about abstract things like bus width, it doesn't matter. It's a tool, you stick it in your PC, screw the case cover closed and you never think about it again.
 
I guess people would rather go to any length making excuses to buy an overpriced GPU than accept that their favorite graphics card brand is screwing them over with shrinkflation.
I'm not surprised at the excuses because it happened with the 4000 series, but the justifications are getting really ridiculous with this 5000 series, especially with all the hyperbole and attacking the messenger, instead of realizing an upgrade isn't necessary. People need to vote with their wallets.
Yeah. Until recently we had "It offers DLSS and CUDA, so it's normal to have a higher MSRP", and now we've gone to "I don't care about MSRP, I will defend ANY price, because ANY price is justified".

Fun fact about that upgrade you mention and its necessity: imagine how many people in here describe DLSS as better than native, yet choose to replace a perfectly capable and fast card with a faster one, just to have enough compute power to disable DLSS completely.


Anyway, it's really simple. Anyone who bought an Nvidia card at MSRP, or at a somewhat-not-ridiculous price, sees its value skyrocketing, even on the second-hand market. If they happen to have a couple of older Nvidia or even AMD cards around, they will see the value of those cards also going up. So, if this trend continues, they will be able to use their new card for a year or two and then sell it at a price close to what they paid for it. You can't argue with someone who sees opportunity behind these price increases. He will never agree with your arguments.

None of this applies to the subject of the thread, the 60 series, which does sell at or near MSRP. Other than that, it seems you've invented a lot of straw men to get angry about.

FWIW I do use nvenc to encode videos regularly when I need speed; otherwise I'll do it on the CPU if I have time and need/want the quality. Another benefit of Nvidia cards specifically over the competition (unless the others improved and I never heard about it) is the way you can easily record full-quality gameplay in real time using their own software, with no performance penalty (or a negligible one).
ALL of the above apply to ANY card, not just the 60 series.

Then you say that I invent straw men and that somehow I am angry, only to follow up by pointing at what you do, as if those are selling points for anyone buying a card FOR GAMING (not shouting, just a reminder of the primary function of these cards). I like how you imply that nvenc produces worse results. So how is this a selling point? Record gameplay? For what reason? Not everyone has a channel online posting gaming footage.

I imagine most people buying 60-series cards game at 1080p (like me) or 1440p and have no illusions about running at max details; if we can, then it's a bonus, and I take it as a given I'll have to turn some things down, usually an insignificant sacrifice. You can't go much lower than the 60 series if you want a very good gaming experience, so shaming people for buying this particular tier of card makes no sense.

The poster above is right too, nobody normal cares about abstract things like bus width, it doesn't matter. It's a tool, you stick it in your PC, screw the case cover closed and you never think about it again.
No, marketing makes people expect that they can try higher resolutions. And it's shortsighted to only look at new games that ask, for example, for an RTX 4060 as the minimum for 60 fps at 1080p. There are plenty of games that need less compute power to reach meaningful framerates at 4K, and even older AAA games that can run on an RTX 4060 at 4K with all settings at ultra. Not everyone is rushing to buy the best card to run new games on day one at full quality. People who do that are probably the ones willing to pay any price. And I am not shaming people who buy "this particular tier". My God, my main GPU is an RX 6600. If you can't find arguments, don't try to create them by distorting what I wrote.

Here, let me remind you what I posted.
When your customers are stupid and willing to pay 50-100% inflated prices over a fake MSRP, the most stupid thing to do is be honest and not take advantage of that stupidity by turning it into higher profit margins.
When people are willing to pay 50% more over MSRP, that's what happens. I point out that 8-10 years ago everyone could buy the equivalent GPU for half the price, and people end up distorting what I wrote to make me look bad. Because in the end, when all other arguments fail, the only thing left to do is obviously to make the other person look bad. Right?

And if you don't care about the bus, Nvidia does. A GPU with a 128-bit data bus is cheaper to produce and also has less memory bandwidth, meaning a higher profit margin for the manufacturer and less future-proofing for the consumer. So, yeah, ignore it. It doesn't matter anyway. We have neural compression now. Hurray! A 96-bit data bus and 6 or 12 GB of VRAM will be enough even in 2030 for the sub-$500 category.
 
I like how you imply that nvenc produces worse results. So how is this a selling point?
It produces quality nearly as good as CPU encoding but much faster, the benefits of this should be obvious enough. Sometimes you want it done fast and if you're just going to watch it once yourself then it's good enough.

Record gameplay? For what reason? Not everyone has a channel online posting gaming footage.
A lot of people do put their gaming videos online, you don't have to be a big name YouTuber to do it. It's another feature nvidia has which gives their cards added value.

And if you don't care about the bus, Nvidia does. A GPU with a 128-bit data bus is cheaper to produce and also has less memory bandwidth, meaning a higher profit margin for the manufacturer and less future-proofing for the consumer. So, yeah, ignore it. It doesn't matter anyway. We have neural compression now. Hurray! A 96-bit data bus and 6 or 12 GB of VRAM will be enough even in 2030 for the sub-$500 category.
I care about end results - if the 128-bit bus and 288 GB/s of bandwidth get good results, then that's all that matters. It beats cards from recent years with a 256-bit bus and 50% higher bandwidth, so Nvidia is obviously doing something right. (Edit: the 5060 cards will likely demonstrate that bandwidth doesn't make a lot of difference beyond a certain level. They are 128-bit as well, but their higher GDDR7 memory speed is roughly equivalent to using GDDR6 like the 4060 but on a wider bus. So while the bandwidth may increase by, what, 50%? I'm guessing performance will be more like 10-20% higher, some of that due to higher clocks on the newer cards too.)
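To put numbers on that guess: theoretical bandwidth is just bus width times per-pin data rate. A minimal Python sketch; the 3060 Ti and 4060 Ti figures are their published specs, while the 5060 Ti's 28 Gbps GDDR7 is the commonly reported pre-launch figure, so treat that one as an assumption:

# Theoretical memory bandwidth = (bus width in bits / 8) * data rate in Gbps per pin.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

cards = {
    "RTX 3060 Ti (256-bit, 14 Gbps GDDR6)": (256, 14),
    "RTX 4060 Ti (128-bit, 18 Gbps GDDR6)": (128, 18),
    "RTX 5060 Ti (128-bit, 28 Gbps GDDR7)": (128, 28),  # 28 Gbps is the reported, not confirmed, speed
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
# 448 vs 288 GB/s is ~55% more than the 4060 Ti, and the same 448 GB/s the
# 256-bit 3060 Ti already had.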

This argument is like getting angry at cars now having smaller engines and turbos, how outrageous, we used to have big engines, now we only have small ones, they can't keep getting away with this! But how do those cars perform? People getting hung up on paper specs and ignoring performance are not seeing the wood for the trees.
 
These two can actually be unironic needs if that's how one makes his daily bread. Yes, the first one included. No judgement from me; one does what he has to do to get the bag.
We are talking about gaming cards. People who transcode videos, create pictures of elf girls, upload gaming footage online, run Blender to create realistic images as a job, as a way to make money, are probably, what..... 0.1%?
 
We are talking about gaming cards. People who transcode videos, create pictures of elf girls, upload gaming footage online, run Blender to create realistic images as a job, as a way to make money, are probably, what..... 0.1%?
More than you think, actually. And they aren’t really “gaming” cards as such. They are CONSUMER cards. What one does with them is up to the user. Since the advent of GPGPUs there isn’t really such a thing as a “gaming” card or, as they used to be called, graphics accelerators. When you stop viewing them as just toys for playing games things start making significantly more sense. Again, this isn’t a new thing.
 
It produces quality nearly as good as CPU encoding but much faster, the benefits of this should be obvious enough. Sometimes you want it done fast and if you're just going to watch it once yourself then it's good enough.
If you are going to watch it ONCE, you just watch the original. If you are going to store it for a future watch, you go with the best quality option.

A lot of people do put their gaming videos online, you don't have to be a big name YouTuber to do it. It's another feature nvidia has which gives their cards added value.
A lot of people? Like how many? 10? 100? 1000? 10000? MILLIONS of GPUs are sold every quarter.
So, added value for those 10, 100, 1000, 10000. A useless feature for the MILLIONS who don't. MILLIONS who fell victim to marketing.
Let's ALL buy tractors to drive in the city, because they are so useful to so many farmers. Thanks to tractors farmers make money, so this is a very strong feature. We should all buy tractors to drive in the city, because tractors are an investment. If tractors seem like a bad argument, how about minivans?

I care about end results - if the 128-bit bus and 288 GB/s of bandwidth get good results, then that's all that matters. It beats cards from recent years with a 256-bit bus and 50% higher bandwidth, so Nvidia is obviously doing something right.
The 128-bit data bus limits the options to 8 GB or 16 GB of VRAM. It's not just the bandwidth; it's a way to keep selling 8 GB cards while charging much more for the 16 GB model. The limited bandwidth, on the other hand, limits the card's ability to use all that extra memory to offer better performance at higher settings or resolutions. You end up saying "good results" because you haven't seen what that card could do with more bandwidth.
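For anyone wondering where that "8 GB or 16 GB" limitation comes from: GDDR6/GDDR7 modules are 32 bits wide, so a 128-bit bus means four modules (eight in a clamshell layout), and capacity is simply module count times module density. A minimal sketch; the 3 GB rows assume the denser GDDR7 modules mentioned upthread:

# VRAM options from bus width: each GDDR6/GDDR7 module is 32 bits wide, so
# modules = bus_bits / 32; a clamshell layout mounts two modules per channel, doubling capacity.
def vram_options_gb(bus_bits, module_gb):
    modules = bus_bits // 32
    return modules * module_gb, modules * module_gb * 2  # normal, clamshell

for bus_bits, module_gb in [(128, 2), (128, 3), (192, 2), (192, 3)]:
    normal, clamshell = vram_options_gb(bus_bits, module_gb)
    print(f"{bus_bits}-bit bus, {module_gb} GB modules: {normal} GB or {clamshell} GB")
# 128-bit with today's 2 GB modules gives exactly the 8 GB / 16 GB split discussed here;
# 192-bit with 3 GB GDDR7 modules is what would make an "18 GB 5070" possible.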

More than you think, actually. And they aren’t really “gaming” cards as such. They are CONSUMER cards. What one does with them is up to the user. Since the advent of GPGPUs there isn’t really such a thing as a “gaming” card or, as they used to be called, graphics accelerators. When you stop viewing them as just toys for playing games things start making significantly more sense. Again, this isn’t a new thing.
I doubt there are many. I doubt they even amount to 1%, if 0.1% seems small.
And it's not just how I see these cards. Their main purpose is to accelerate graphics in games, and that's how the overwhelming majority sees them. Every other function is useful to some, but to very few compared to the people who use them purely as gaming cards. You know, I have read the phrase "I have had enough with the prices/availability/whatever of gaming cards. I am going to buy a console" many times. I have never read anything different. People saying they are going to Xbox or PS don't mean they are trying to find alternatives for streaming, or Blender, or something else. They are talking about gaming. We are not talking about smartphones here, where calling someone has always been the main function, but not the reason everyone is looking at a smartphone screen today.
 
If you are going to watch it ONCE, you just watch the original. If you are going to store it for a future watch, you go with the best quality option.
Do you think that if 'just watch the original' was an option I would be encoding it another way? There are two reasons I use nvenc: to make 'the original', or to convert something which is in the wrong format to play directly on my television - in which case I quickly use handbrake to make a compatible file.

The 128-bit data bus limits the options to 8 GB or 16 GB of VRAM. It's not just the bandwidth; it's a way to keep selling 8 GB cards while charging much more for the 16 GB model. The limited bandwidth, on the other hand, limits the card's ability to use all that extra memory to offer better performance at higher settings or resolutions. You end up saying "good results" because you haven't seen what that card could do with more bandwidth.
We'll have the answer to that when the 5060 Ti is reviewed. It has 55% higher bandwidth than the 4060 Ti; how much of that do you think will translate into higher performance? It's the same bandwidth as the 3060 Ti, actually, which the 4060 Ti already beats.
 
This was already addressed by several people, me included. Steve is working from a flawed position here by default. The 4090 and 5090 are completely warping the entire comparison since they are, essentially, unprecedented compared to any consumer chip NV has released before in terms of size and resources. They make everything else look more "cut down" even if it isn't.

But even if we pretend that the GN argument has merit (it does in a vacuum, just not in the way they chose to present it) - so what? People are still buying the cards, aren't they? I can assure you, only the enthusiast minority on forums like these even cares about the chips in the cards and how much they are or are not cut down. The layperson just knows that he has an "insert budget here" budget and that there is an NV GPU being sold for roughly that amount of cash. He also "knows" that NV makes the only video cards worth getting. He "knows" that Radeon is just a worse NV with bad drivers that mostly weirdos buy, and he probably has never even heard that Intel makes GPUs, since Intel has been about as bad as Radeon at the whole "capture mind-share" thing. So he goes out (or logs onto the online store of choice) and buys the 4060 or the 4060 Ti or the 4070. And no, he doesn't care that psychotic enthusiasts have declared those cards unusable dogshit with a bad amount of VRAM, pathetic memory buses and underwhelming chips. The layperson, after all, just wants to play games, and the card seemingly does that just fine. So he is content, and the next time he will, once again, buy an NV card. And this is the reason, kids, why seething about "NV bad" and constructing elaborate arguments explaining WHY is completely pointless, a waste of time, and will not turn the dial on the market at all. Thanks for coming to my TED Talk.

The reason why die sizes have increased a lot compared to GTX GPUs is that back in the day there were no RT or Tensor cores... and because Nvidia has not improved its rasterization IPC since Ampere! (Ampere, Lovelace and Blackwell have pretty much the same IPC!) The real difference between them is the number of CUDA cores, the memory type and the memory bus width.
The reason people are buying x90 GPUs is that they are the best on the market, period. I have a 4K QD-OLED 240 Hz monitor, so I want to play at 4K as much as possible, and I'm not spending $1000+ on a GPU that only has 16 GB of VRAM...
And yes, the CUDA core counts do matter, because the GTX 1080 Ti had 93% of a full GP102 die but was sold for $700, whereas the 4090 & 5090 have 89% of a full die but cost $1600 & $2000 (MSRP). Therefore the GTX 1080 Ti was a much better value than today's GPUs (even when accounting for inflation).
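The enabled-die percentages are easy to verify from the CUDA core counts; here's a minimal sketch, with the ~1.3 inflation multiplier for 2017 dollars being my own rough assumption rather than an official figure:

# Enabled fraction = shipping CUDA cores / full-die CUDA cores.
cards = {
    #              (cores, full-die cores, launch MSRP in USD)
    "GTX 1080 Ti": (3584, 3840, 699),
    "RTX 4090":    (16384, 18432, 1599),
    "RTX 5090":    (21760, 24576, 1999),
}
INFLATION_SINCE_2017 = 1.3  # assumed multiplier, not an official figure

for name, (cores, full_cores, msrp) in cards.items():
    print(f"{name}: {cores / full_cores:.0%} of the full die at ${msrp} MSRP")

adjusted = cards["GTX 1080 Ti"][2] * INFLATION_SINCE_2017
print(f"GTX 1080 Ti MSRP in today's money: ~${adjusted:.0f}")  # ~$910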
 
There are two reasons I use nvenc: to make 'the original', or to convert something which is in the wrong format to play directly on my television - in which case I quickly use handbrake to make a compatible file.
In both cases you want to go with the best quality route, not the fastest transcoding option. Especially if something IS the original, you put the quality settings at maximum. We have plenty of storage today.

We'll have the answer to that when the 5060 Ti is reviewed. It has 55% higher bandwidth than the 4060 Ti; how much of that do you think will translate into higher performance? It's the same bandwidth as the 3060 Ti, actually, which the 4060 Ti already beats.
Well, I guess if the 4060 Ti is a 1080p card, the 5060 Ti will be a 1440p card.

The 3060 Ti comes with 4864 CUDA cores at a 1665 MHz boost clock.
The 4060 Ti comes with 4352 CUDA cores (of a newer architecture, so probably higher IPC) at a 2535 MHz boost clock.

So, if we just look at those numbers, a 4060 Ti should be about 35-40% faster. Instead, it is 11% faster than the 3060 Ti in the TPU database. And that's because the 4060 Ti doesn't just have less memory bandwidth, but also a lower number of ROPs, Tensor cores and RT cores. Nvidia took whatever advantage the newer architecture and the higher frequency gave and negated it with cuts in other parts of the chip and the board, to make sure that the 4060 Ti wouldn't offer much higher performance. Because then the 5060 Ti would also need to offer much better performance, and if we start offering 30-50% higher performance with every new x060 card, how are we going to sell those x070, x080 and x090 cards?
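The 35-40% figure is just cores times clock; a minimal sketch of that back-of-envelope estimate (which deliberately ignores IPC, bandwidth, ROP and cache differences, exactly the things that were cut):

# Naive raw-throughput estimate: CUDA cores * boost clock.
cores_3060ti, clock_3060ti = 4864, 1665  # MHz
cores_4060ti, clock_4060ti = 4352, 2535  # MHz

uplift = (cores_4060ti * clock_4060ti) / (cores_3060ti * clock_3060ti) - 1
print(f"Naive cores-x-clock uplift: {uplift:.0%}")  # ~36%, versus ~11% measured in reviews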
 
In both cases you want to go with the best quality route, not the fastest transcoding option. Especially if something IS the original, you put the quality settings at maximum.
How about you let me decide what I need rather than second-guessing. If I can quickly convert something I'm only going to watch once in two minutes at 95% quality, or wait an hour for 100% quality, I'll take the former. As for 'the original', I mean that if I am making/editing a video I will use the quick method to get everything correct, and only after that will I do the final encode on the CPU.

Well, I guess if the 4060 Ti is a 1080p card, the 5060 Ti will be a 1440p card.

The 3060 Ti comes with 4864 CUDA cores at a 1665 MHz boost clock.
The 4060 Ti comes with 4352 CUDA cores (of a newer architecture, so probably higher IPC) at a 2535 MHz boost clock.

So, if we just look at those numbers, a 4060 Ti should be about 35-40% faster. Instead, it is 11% faster than the 3060 Ti in the TPU database. And that's because the 4060 Ti doesn't just have less memory bandwidth, but also a lower number of ROPs, Tensor cores and RT cores. Nvidia took whatever advantage the newer architecture and the higher frequency gave and negated it with cuts in other parts of the chip and the board, to make sure that the 4060 Ti wouldn't offer much higher performance. Because then the 5060 Ti would also need to offer much better performance, and if we start offering 30-50% higher performance with every new x060 card, how are we going to sell those x070, x080 and x090 cards?
Nobody denies or doubts that more bandwidth is better, just that the downsides of 128-bit for this class are exaggerated.

I find it interesting that PC enthusiasts feel entitled to ever faster performance at the same or less cost. Why do people think they have a right to that? The fact GPUs keep improving at all in terms of efficiency after decades is fantastic but not something we should take for granted. Instead of enjoying and being grateful, some prefer to always demand more. 20% faster than previous gen? Not good enough, how dare nvidia not offer me more, for less money too. It is my right! No it isn't.
 
How about you let me decide what I need rather than second-guessing. If I can quickly convert something I'm only going to watch once in two minutes at 95% quality, or wait an hour for 100% quality, I'll take the former. As for 'the original', I mean that if I am making/editing a video I will use the quick method to get everything correct, and only after that will I do the final encode on the CPU.
Well, you are not saying anything different here from what I wrote, so why the rage?
Nobody denies or doubts that more bandwidth is better, just that the downsides of 128-bit for this class are exaggerated.
Show me numbers that prove it. Show me numbers from a 4060 with double the bandwidth. At least I showed you some theoretical numbers that indicate a 20-30% performance loss. Do you have any numbers to show?
I find it interesting that PC enthusiasts feel entitled to ever faster performance at the same or less cost. Why do people think they have a right to that? The fact GPUs keep improving at all in terms of efficiency after decades is fantastic but not something we should take for granted. Instead of enjoying and being grateful, some prefer to always demand more. 20% faster than previous gen? Not good enough, how dare nvidia not offer me more, for less money too. It is my right! No it isn't.
It's not interesting, it's a necessity. The alternative is stagnation and prices going up.

If you are a shareholder of Nvidia, AMD, Intel or Qualcomm, you would want the same performance at the same price, with maybe a feature or two as the carrot to make consumers buy. In that case, yes, it is interesting.
If you are looking at this market as a consumer, you shouldn't see it as something merely interesting, but as a necessity that lets you get a better experience from every new product you buy without having to think about breaking the bank.
 
The xx60 is shaping up to be a never-ending side-grade.
Constant performance, where the only change is the DLSS support.

Monopoly is such fun
 
It's not interesting, it's a necessity. The alternative is stagnation and prices going up.

If you are a shareholder of Nvidia, AMD, Intel or Qualcomm, you would want the same performance at the same price, with maybe a feature or two as the carrot to make consumers buy. In that case, yes, it is interesting.
If you are looking at this market as a consumer, you shouldn't see it as something merely interesting, but as a necessity that lets you get a better experience from every new product you buy without having to think about breaking the bank.
Efficiency for other products doesn't rise by large percentages every few years, or at all in many cases, so why should electronics be any different? Eventually you have made it as efficient as you can. Do you think CPUs and GPUs will keep getting faster and/or more efficient forever? Ignoring all physical limits? It isn't realistic. It isn't possible. We will hit a wall; the only question is when. You cannot always just invent something faster. It's why we still use essentially the same jet engine technology as in the 1940s; modern engines are somewhat more efficient, but there has never been any quantum leap forward since then, because it's the limit of the technology. We still fly across the Atlantic in the same time it took 60 years ago. Eventually you come up against the laws of physics.

Imagine buying a new car from Ford every two years and demanding a 20% increase in performance and fuel economy each time. They would think you were insane. First they would laugh, then they'd tell you to go away.
 
"I am only able to produce 1 image per minute of nude elf girls with this GPU. I want need to buy a faster one to produce more nude elf girl images"
Okay but this is indeed a legitimate need actually, it's pretty vital to my overall health.
We are talking about gaming cards. People who transcode videos, create pictures of elf girls, upload gaming footage online, run Blender to create realistic images as a job, as a way to make money, are probably, what..... 0.1%?
Who do you think is causing the GPU shortages?
 
Efficiency for other products doesn't rise by large percentages every few years, or at all in many cases, so why should electronics be any different? Eventually you have made it as efficient as you can. Do you think CPUs and GPUs will keep getting faster and/or more efficient forever? Ignoring all physical limits? It isn't realistic. It isn't possible. We will hit a wall; the only question is when. You cannot always just invent something faster. It's why we still use essentially the same jet engine technology as in the 1940s; modern engines are somewhat more efficient, but there has never been any quantum leap forward since then, because it's the limit of the technology. We still fly across the Atlantic in the same time it took 60 years ago. Eventually you come up against the laws of physics.
There are two performance components in a modern system: the CPU and the GPU. CPUs are becoming more and more powerful every few years, with better efficiency, higher performance and more cores in almost every category. GPUs are not. Why? Because the CPU market has been more balanced in recent years: Intel enjoys capacity and OEM ties, AMD enjoys better efficiency and performance.
In GPUs, which have become a monopoly, we are celebrating the RX 9070 series as a beacon of life. In fact, in a healthy market, the RX 9070 series would have had an MSRP of $450-$550 and no custom model would go more than $50 over MSRP. The RTX 5070 would have been a $450-$500 model with REAL street prices, the RTX 5070 Ti wouldn't have had a price tag of more than $650, and the top RTX 5090 model would have been around $1200 max.
But never mind, let's find excuses to justify the current market reality, because it's so wonderful for companies and their shareholders.

Imagine buying a new car from Ford every two years and demanding a 20% increase in performance and fuel economy each time. They would think you were insane. First they would laugh, then they'd tell you to go away.
Imagine a new car from Ford performing the same as a 20-year-old model in performance and fuel economy, while costing more.
 
I find it interesting that PC enthusiasts feel entitled to ever faster performance at the same or less cost. Why do people think they have a right to that? The fact GPUs keep improving at all in terms of efficiency after decades is fantastic but not something we should take for granted. Instead of enjoying and being grateful, some prefer to always demand more. 20% faster than previous gen? Not good enough, how dare nvidia not offer me more, for less money too. It is my right! No it isn't.

Efficiency with 600W GPUs and melting connectors?! lol sure :roll:
20% gen over gen after 2 years is ridiculous. Nvidia used to release new GPUs every year with 20-30% more performance, but now it's every 2 years, and except for the x90 the rest is barely improving... So yes, Gamers who have been buying and using Nvidia GPUs for so many years kinda have their say. Without Gamers, Nvidia would never even have existed!
 