
NVIDIA GeForce RTX 5050 Reportedly Scheduled for July Release

And what exactly has happened to the speed of GDDR7 with nGreedia? It shows no performance improvement whatsoever, especially at the mid to lower end. The 9060 XT is mostly faster than the 5060, despite the 5060 using GDDR7. tl;dr: it does not show in the real world.

That is strange; even with the 5060 Ti vs. the 9060 XT, GDDR7 vs. GDDR6 doesn't seem to make much of a difference. I suspect the Nvidia cards are bottlenecked by their fewer ROPs.
 
What would you say is a fair price for a 5050 & 5060?
Not really interested in nvidia cards outside of production, so the 5060 isn't really appealing to me.
GB206 is a really solid 1080p card and is effectively a 1080Ti replacement with less memory.


What I'm saying is an 11GB 1080Ti does the same work as my 2GB P620.
Soooooo I would only care about the one that clocks in at a lower price.
 
If this thing really sucks down 130W that is nuts. My undervolted 5070 hangs right around that level. A 50 series card shouldn't even require a power cable.
Wow, that's original.
 
No, the 5060 non-Ti is the xx30 from previous generations. This is the xx10 or xx20.
Oh, more daft nonsense.

Where do people come up with crap like this? Are they literally pulling it out of their bum?

Wow, that's original.
It's also incorrect. 75W is the slot power limit, so their 130W comparison is more than a bit out of whack.

Still, they made a valid point in saying that a 5050 should be possible without a PCIe power cable.
 
At least I can say I am fighting the good fight for the small guy to have better and cheaper GPUs.

Your narrow thinking leads to you supporting corporate greed. It must be difficult being so naive.
For the first many years of Nvidia and ATI, the desktop market was growing: nearly every family in the US had a desktop computer, most desktop users found need of a discrete GPU, the cost to develop and make a GPU was low, and many people wanted a modest discrete GPU.

Those days are over. A GPU architecture now costs billions of dollars to design, wafers go up in price with each new node, most people use cellphones instead of computers, when people do use computers they usually use laptops, and most new desktops are gaming PCs with most of the rest being creator or professional PCs. The market for low-end GPUs is almost gone, but there is a huge market for more expensive GPUs. "Fighting the good fight" for the "small guy" to have cheaper GPUs means encouraging the GPU makers to just sell any cheap GPU at all.
 
Buddy, the 5060 is only 25% faster than the 7600 (Which is basically a 6600).
Did you mean 20% instead of 200%?
No... I was just being sarcastic. I got my RX 6600 for that price; here we are two gens later, and I'm betting the performance/price won't be worth an upgrade from that card.

Even if it's only 50% faster than a 3050, it'll be a good upgrade for certain users.
Hmmm, well, the 4050 was about 50% over the 3050, so if you can still get a 4050 for less money than a 5050, why bother?

The 5060 is only 14% over the 4050, which is embarrassing... meaning the 5050 could deliver less performance than the 4050?
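Worth noting: generational uplifts like the "50% over the 3050" and "14% over the 4050" figures above compound multiplicatively, not additively. A quick sketch with illustrative relative-performance indices (the numbers are made up to match the percentages quoted, not benchmark data):

```python
# Hypothetical relative-performance indices, 100 = RTX 3050 baseline.
# These values are illustrative only, chosen to match the quoted uplifts.
perf = {"RTX 3050": 100, "RTX 4050": 150, "RTX 5060": 171}

def uplift(a: str, b: str) -> float:
    """Percent performance gain of card b over card a."""
    return (perf[b] / perf[a] - 1) * 100

print(f"4050 over 3050: {uplift('RTX 3050', 'RTX 4050'):.0f}%")  # 50%
print(f"5060 over 4050: {uplift('RTX 4050', 'RTX 5060'):.0f}%")  # 14%
# Gains compound: 1.50 * 1.14 = 1.71, i.e. ~71% over two generations,
# not 50 + 14 = 64.
```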
 
3060 performance 2 generations later. 8 GB limit
There's no issue with the 5050 having the performance of a 3060. However, price and power consumption MUST come down.

That used to be the deal: getting the performance of the older higher tier card at a lower cost and power.
 
How much lower can a video card get before it just matches the iGPU's performance?
 
Hmmm, well, the 4050 was about 50% over the 3050, so if you can still get a 4050 for less money than a 5050, why bother?
Because the 4050 was a laptop-only GPU. There was no desktop version of it. There is a desktop version of the 3050, and there is supposed to be a desktop version of the 5050. That's why. Knowing the 5050 was expected for desktops, I've been waiting to upgrade my 1050. If the 5050 turns out not to happen, then a 3050 6GB will be the future purchase.
 
How much lower can a video card get before it just matches the iGPU's performance?
Depends on what card and what iGPU you're referring to. If you're willing to pay a lot, Ryzen AI Max+ 395 (Strix Halo)'s RX 8060S iGPU is in the ballpark of a desktop RTX 3060.
 
For the first many years of Nvidia and ATI, the desktop market was growing: nearly every family in the US had a desktop computer, most desktop users found need of a discrete GPU, the cost to develop and make a GPU was low, and many people wanted a modest discrete GPU.

Those days are over. A GPU architecture now costs billions of dollars to design, wafers go up in price with each new node, most people use cellphones instead of computers, when people do use computers they usually use laptops, and most new desktops are gaming PCs with most of the rest being creator or professional PCs. The market for low-end GPUs is almost gone, but there is a huge market for more expensive GPUs. "Fighting the good fight" for the "small guy" to have cheaper GPUs means encouraging the GPU makers to just sell any cheap GPU at all.

Aye, it costs billions, and they're making trillions. They are one of the richest companies in the entire world. But you want to tell me selling GPUs for maybe $100 less is not achievable and would make them poor? LMAO
 
when people do use computers they usually use laptops, and most new desktops are gaming PCs with most of the rest being creator or professional PCs. The market for low-end GPUs is almost gone, but there is a huge market for more expensive GPUs. "Fighting the good fight" for the "small guy" to have cheaper GPUs means encouraging the GPU makers to just sell any cheap GPU at all.
I think all of that is perfectly okay. A shrinking need for low-end GPUs just means there will be a sudden explosive demand for them right around the corner.
When I have a laptop or tablet, the integrated display is never high end. It isn't necessary for the work that I need out of it. Maybe an additional display out.

It's the desktop and server (retired desktop) spaces where it's usually needed and way more complicated. I know where my servers top out and that's fine.
The market for low-end dGPUs isn't one big slope either; it's a sine wave. My extremely long-stride purchase behaviors reflect that to some degree as well.
This year I bought a fairly high end $$$ GPU on the AMD side: a 9070XT, hopefully sunsetting my worn and aging RX580 for the last time.
Not even a month later I pulled the trigger on a low $$ nvidia Quadro that is effectively the worst scraps shared with the legendary 1080Ti.
Is it still old and obsolete? Yes. Does it do the ultra specific jobs that urged me to buy it in the first place? Also yes.

So until things break in a very serious manner, my servers will go brrr.
The worst scraps of new products should be direct replacements.
When they miss that redeeming quality, NOBODY will buy them.
 
I don't know why this card gets so much hate; it has 8 GB of memory, which is plenty for a low-end card. The only thing that matters is the pricing.
 
Finally. With this hopefully the "the 5060 series are budget/entry-level cards" copers can shut the hell up now... Oh, who am I kidding.
Guessing this will launch at $249 to compete with B580, which is definitely due for a follow-up after its CPU-dependency debacle.
That really should've been the 5060's price this gen; how sad. Maybe this one will actually come out at around $200, which wouldn't be too bad.
 