
NVIDIA GeForce RTX 5050 8 GB

For a flat 200 it would actually be a banger card for people on a budget. For 250 it’s a joke and every AIB model that’s over that is an entire circus. No, please, I am sure that this 130 watt GPU absolutely NEEDS a tri-fan solution.
Yes $200 flat would have been great.
Did the 3050 almost beat the 2060 when it came out? No, this is closer to last gen than the 3050 was to the 2060.
The 3050 at launch was around 1070-level performance, whereas the 2060 was around a 1080. There was also a 3-year gap between the two, which made it even worse (the 5050 is a 2-year gap and about as fast as the 4060).
 
Well it runs plenty of games (Doom, Elden Ring, etc.) above 60 fps, so it's certainly not a useless GPU at 125 W TDP. For older/low-spec/eSports-type games like CS2, Valorant, SC2, etc. it's fine too. But I would still strongly recommend people avoid it and save up a bit for a better one.
 
The entire bottom end looks like it's priced to push you into the $400+ tier.

$250 for this thing is a bad joke.


Yeah it's 17% cheaper than the 4060 and slightly worse but in 2025 that is a win I guess.

I agree everything below 400 is pushing people up to the 5060ti 16GB/5070 which are also two cards that are meh AF vs what they replaced in the same price range.

I guess there is the 9060XT 16GB for those that want to make a bet AMD will support it properly for longer than 1 generation.
 
The low end is an upsell for the midrange cards; Nvidia did the same with the 40 series though. It's sad when a 3060 Ti from 5 years ago still beats this card by a country mile.
And the die size, the amount of empty PCB space, the cooler quality, the plastic backplate... this card looks like it should be $150, not $250. With that much of the PCB left empty, the x50 really should be an HHHL card powered entirely from the motherboard slot.
 
let’s actually act like tech enthusiasts and not gibbering goblins.
I am acting like a tech enthusiast in that I am not praising a $250 card that handles 1080p "with the right settings". I felt like I was reading a review of an $80 card, back in 2010.
The thing is I do understand you, and I can admit calling it "RT 5030" is me throwing a cheap joke in the arena, which you call "edgy" and I call caricature.
Being demanding as a consumer, prosumer, whatever, is the basis of where I'm going with this. The lack of competition still doesn't justify how bad a choice this is, given that the chip itself is badly positioned and marketed, and that you're better off with older hardware.
Also, as a tech enthusiast, I've seen over the years newer-gen beating the previous gen at the same price-point, in the same class. That was the value argument. When it did beat the next higher class, the win was for us, consumers. That would make it the go-to choice for the budget. So, again, I see nothing on this chip and product that makes it "Fine".
Give me a lecture that those are bygone days and I'll reply: What still counts is the money in my pocket and their sale. They will keep getting money and releasing bad value products while people don't demand better. The 3050 still sold, remember.
 
I see it as NVIDIA 710 or call it NVIDIA 730 refresh.

As it is a 710 refresh the price should be up to 70€. (I'm very nice, those cards were up to 40€ for a very long time)

The Nvidia 700 series is officially no longer supported by current drivers. Of course you can still use legacy drivers.

Playing a 10-year-old game like The Witcher 3 of course makes it a gaming card. /sarcasm - par excellence.


The Witcher 3: Wild Hunt is a 2015 action role-playing game developed and ...
 
Nvidia being Nvidia, nothing more or less; typical of the underpowered RTX 50 series.

It's so close to being a downgrade it's just crazy: basically the same GPU for the same price after two years. :kookoo: Keep buying Nvidia and an actual downgrade may happen at some point.
 
Playing a 10-year-old game like The Witcher 3 of course makes it a gaming card. /sarcasm - par excellence.


The Witcher 3: Wild Hunt is a 2015 action role-playing game developed and ...

You do realize that Witcher received a massive graphics update, right? It's by no means the 2015 game on that front.

 
I am acting like a tech enthusiast in that I am not praising a $250 card that handles 1080p "with the right settings". I felt like I was reading a review of an $80 card, back in 2010.
My very first post in this very thread is that this card should be 200 and that 250 is a joke. The price is bad. The CARD is fine. It’s not a bad piece of technology. It’s not the 3050 which was objectively a piece of shit at any (reasonable) price.

Being demanding as a consumer, prosumer, whatever, is the basis of where I'm going with this. The lack of competition still doesn't justify how bad a choice this is, given that the chip itself is badly positioned and marketed, and that you're better off with older hardware.
You don’t get to demand anything in a monopolistic market. It might be sad, but that’s how the cookie crumbles.

Give me a lecture that those are bygone days and I'll reply: What still counts is the money in my pocket and their sale. They will keep getting money and releasing bad value products while people don't demand better. The 3050 still sold, remember.
Exactly. At least this time what will be sold isn’t dreadful.

Look, @_JP_ , I actually think that we are more in agreement than not, and perhaps I overreacted to the way you made your point, for which I am sorry. But perhaps you can understand why I find such posturing tiresome when it inevitably devolves into deeply unserious nonsense like the very next post after yours:
I see it as NVIDIA 710 or call it NVIDIA 730 refresh.

As it is a 710 refresh the price should be up to 70€. (I'm very nice, those cards were up to 40€ for a very long time)
You have to admit, nobody sane actually believes that this card should or even CAN be priced at 70 dollars and it absolutely isn’t a successor to the display adapters of old. It just is not.
 
You do realize that Witcher received a massive graphics update, right? It's by no means the 2015 game on that front.


I think W1z still uses the DX11 variant because the DX12 variant crushes every CPU due to it being heavily reliant on one render thread.

Could be wrong though. I much prefer the DX12 version, but it isn't a very good GPU benchmark.
 
My very first post in this very thread is that this card should be 200 and that 250 is a joke. The price is bad. The CARD is fine. It’s not a bad piece of technology. It’s not the 3050 which was objectively a piece of shit at any (reasonable) price.


You don’t get to demand anything in a monopolistic market. It might be sad, but that’s how the cookie crumbles.


Exactly. At least this time what will be sold isn’t dreadful.

Look, @_JP_ , I actually think that we are more in agreement than not, and perhaps I overreacted to the way you made your point, for which I am sorry. But perhaps you can understand why I find such posturing tiresome when it inevitably devolves into deeply unserious nonsense like the very next post after yours:

You have to admit, nobody sane actually believes that this card should or even CAN be priced at 70 dollars and it absolutely isn’t a successor to the display adapters of old. It just is not.

100% bad due to pricing, but I also don't see any incentive for Nvidia to price it any lower when they DGAF whether the DIY market buys it or not. It's 100% meant for prebuilts, where people don't know jack about GPUs.
 
@W1zzard - I honestly and genuinely don't fathom why you used up your valuable time on this card. But I guess it just confirms what all the other reviews have said: the 5050 is DOA and a waste of silicon.
 
edit: Very nice to make a show-off card with 3 big fans and then use only 2 heatpipes. I think typical graphics cards have 6 or 8 or 12 heatpipes. Fewer heatpipes and less copper means worse heat transfer out of the card.
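
To put some purely made-up numbers on that: a cooler is basically a thermal resistance, so the steady-state temperature rise is roughly power times R_th, and every heatpipe and gram of copper you drop pushes R_th up. The R_th values in this little sketch are invented just to show the trend, not measured for this card:

```python
# Back-of-the-envelope: steady-state GPU temperature rise is roughly
# delta_T = P * R_th (power times the cooler's thermal resistance).
# The R_th values below are purely hypothetical, chosen only to show
# the trend of removing heatpipes/copper; they are not measurements.

power_w = 130      # roughly the 5050's board power
ambient_c = 25     # assumed case/ambient temperature

coolers = {
    "beefier cooler (illustrative R_th)": 0.25,      # K/W, hypothetical
    "two-heatpipe cooler (illustrative R_th)": 0.40,  # K/W, hypothetical
}

for name, r_th in coolers.items():
    delta_t = power_w * r_th
    print(f"{name}: +{delta_t:.0f} °C over ambient -> ~{ambient_c + delta_t:.0f} °C")
```

Same 130 W either way; the skimpier cooler just runs everything hotter (or louder).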
Even MSI nerfed its Gaming Trio from 5 to 4 heatpipes; the 5080 Ventus has 4 heatpipes too, the Plus has 6.
 
$250 is becoming interesting again. If only Intel would cut prices of B580 a bit, we'd have a true price war.


And AMD is MIA. And if not for their 9060XT 16GB they'd have nothing interesting under $400. They're caught between a rock and a hard place since they don't want to be seen as the "budget choice", but it's unlikely that a 9050 will offer much over the other choices at $250. If it did it'd be too close to 9060XT 8GB levels... So will we see a $230 or $240 part from them? Who knows.
 
I'm surprised this even gets soundly beaten by the (admittedly still very capable but aging) 3060ti.
 
Thanks for the review @W1zzard . There's a mistake in the "Pictures and Teardown" section:

Gigabyte's Gaming OC is a slightly premium version of the RTX 5050, using a dual-fan, triple-slot cooler.

As for the card itself, I wish they had sacrificed some performance for a lower power draw. The 3050 consumed significantly less power than the 3060, but this is essentially a 5060 from the power consumption point of view.

 
I was kind of expecting it to be worse, tbh. I mean, that's not a high bar, but still.
For the right price this could be an a-okay card for the more casual use cases, I guess, but as usual it's the pricing that's the biggest problem.
The last time I owned a 50-tier card was a GTX 950 when it was new, and at the time it was alright for my use case. I used that card for almost 3 years until it started to feel lacking in newer games. (It was a heavily factory-OC'd model though, about the same as a stock 960 but still cheaper and with a solid cooler on it, and apparently it's still alive at a friend's as a backup card. :))

Could I pull off the same with a 5050, as in using it for almost 3 years and even playing new games on it? Yeah, I would seriously doubt that. :oops:
 
Not watercooled? Disappointing.
 
As for the card itself, I wish they had sacrificed some performance for a lower power draw. The 3050 consumed significantly less power than the 3060, but this is essentially a 5060 from the power consumption point of view.


While I'd like the designers of each card within the same family (Blackwell or RDNA4, etc) to implement a consistent efficiency profile within that family, you can still achieve similar if not identical efficiency simply by limiting or increasing power in Afterburner/Adrenalin. Or by clock speed at the low end if the power sliders are artificially limited.

For example: I have an RTX 3070 and 3060, located on opposite sides of the efficiency spectrum. But only at stock power limits (220W and 170W, respectively). If I scale the power to the core count and run the 3060 at 135-140W, then it matches the efficiency of the 3070. Yes, that makes the 3060 run a bit slower but everything is relative.

You could claim the 3070 is core-heavy, thus hamstrung by its 220W and forced to run efficiently by that artificial limitation and IMO that is 100% accurate. Or you could claim the 3060 is core-shy and is given more power to try to make up the difference and IMO that is 100% accurate.

So I choose what I run my GPUs at because I like efficiency and (with one exception because I think I got a brown sample, opposite of golden sample) all GPUs in the same family can easily achieve the same efficiency when run at the same relative power or clock speeds.

I'd do the same with this 5050 if I had it, probably run it at 100W or so. With an undervolt of course, I'm not a heathen.
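
Just to put numbers on the "scale power to core count" bit (core counts are the public CUDA core counts, the power limits are the stock ones above; the linear scaling itself is obviously just a rough heuristic, not a law):

```python
# Sketch of the "scale power to core count" idea from the post above.
# Core counts are the public CUDA core counts; power limits are the
# stock values mentioned in the post (220 W and 170 W).

cards = {
    "RTX 3070": {"cores": 5888, "stock_power_w": 220},
    "RTX 3060": {"cores": 3584, "stock_power_w": 170},
}

reference = cards["RTX 3070"]
watts_per_core = reference["stock_power_w"] / reference["cores"]

for name, c in cards.items():
    scaled = c["cores"] * watts_per_core
    print(f"{name}: stock {c['stock_power_w']} W, core-scaled ~{scaled:.0f} W")

# The 3060 comes out to ~134 W, i.e. the 135-140 W range mentioned above.
```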
 
While I'd like the designers of each card within the same family (Blackwell or RDNA4, etc) to implement a consistent efficiency profile within that family, you can still achieve similar if not identical efficiency simply by limiting or increasing power in Afterburner/Adrenalin. Or by clock speed at the low end if the power sliders are artificially limited.

For example: I have an RTX 3070 and 3060, located on opposite sides of the efficiency spectrum. But only at stock power limits (220W and 170W, respectively). If I scale the power to the core count and run the 3060 at 135-140W, then it matches the efficiency of the 3070. Yes, that makes the 3060 run a bit slower but everything is relative.

You could claim the 3070 is core-heavy, thus hamstrung by its 220W and forced to run efficiently by that artificial limitation and IMO that is 100% accurate. Or you could claim the 3060 is core-shy and is given more power to try to make up the difference and IMO that is 100% accurate.

So I choose what I run my GPUs at because I like efficiency and (with one exception because I think I got a brown sample, opposite of golden sample) all GPUs in the same family can easily achieve the same efficiency when run at the same relative power or clock speeds.

I'd do the same with this 5050 if I had it, probably run it at 100W or so. With an undervolt of course, I'm not a heathen.
Of course, people on this forum are far more likely than the general populace to tweak their cards, but this and the 5060 are GPUs that will be primarily used by people who have neither that knowledge nor the inclination to seek it. As such, there should have been a bigger difference between the 5050 and the 5060's power consumption at stock. Comparing this review to that of the 5060 shows that the 5050 clocks about five to eight percent higher than the 5060. Dialling the clocks back to match its larger sibling would have allowed it to be as efficient.
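
Rough rule-of-thumb math on that, assuming dynamic power goes roughly as frequency times voltage squared and that voltage tracks frequency along the DVFS curve (so power falls off roughly with the cube of the clock); purely illustrative, not a measurement of this card:

```python
# Very rough rule of thumb: dynamic power ~ f * V^2, and along the DVFS
# curve V tends to track f, so power scales roughly with f^3.
# Applying that to the 5-8 % clock difference mentioned above.

for clock_cut in (0.05, 0.08):
    scale = 1.0 - clock_cut
    power_scale = scale ** 3          # f * V^2 with V ~ f  ->  ~f^3
    print(f"-{clock_cut:.0%} clocks -> roughly -{1 - power_scale:.0%} power")
```

So dialling the clocks back by that 5-8 % would plausibly have shaved something in the region of 15-20 % off power at stock, which is the kind of gap people expect between an x50 and an x60.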
 
Even MSI nerfed its Gaming Trio from 5 to 4 heatpipes; the 5080 Ventus has 4 heatpipes too, the Plus has 6.
Some of the 9070 XT cards have 4 heat pipes as well. The PowerColor Hellhound and Reaper, can't remember which. They're over-300-watt cards, so nothing surprises me anymore.
 