Friday, February 2nd 2024

NVIDIA GeForce RTX 3050 6GB Formally Launched

NVIDIA today formally launched the GeForce RTX 3050 6 GB as its new entry-level discrete GPU. The RTX 3050 6 GB is a significantly different product from the original RTX 3050, which the company launched as a mid-range product back in January 2022. The original RTX 3050 debuted on the 8 nm GA106 silicon, with 2,560 CUDA cores, 80 Tensor cores, 20 RT cores, 80 TMUs, and 32 ROPs, paired with 8 GB of 14 Gbps GDDR6 memory across a 128-bit memory bus. Those specs also matched the maximum core configuration of the smaller GA107 silicon, so toward the end of 2022 the company reissued the RTX 3050 on GA107 with no change in specs, but with a slight improvement in energy efficiency from the switch to the smaller chip. The new RTX 3050 6 GB is based on the same GA107 silicon, but with significant changes.

To begin with, the most obvious change is memory. The new SKU features 6 GB of 14 Gbps GDDR6 across a narrower 96-bit memory bus, for 168 GB/s of memory bandwidth. That's not all: the GPU is significantly cut down, with just 16 SM instead of the 20 found on the original RTX 3050. This works out to 2,048 CUDA cores, 64 Tensor cores, 16 RT cores, 64 TMUs, and an unchanged 32 ROPs. The GPU also comes with a lower boost clock of 1470 MHz, compared to 1777 MHz on the original RTX 3050. The silver lining of this SKU is its total graphics power (TGP) of just 70 W, which means cards can do away with power connectors entirely and rely on PCIe slot power alone. NVIDIA hasn't listed its own MSRP for this SKU, but last we heard, it was supposed to go for $179 and square off against the likes of the Intel Arc A580.
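
As a quick sanity check on that bandwidth figure, GDDR6 bandwidth follows from the per-pin data rate and the bus width. A minimal, purely illustrative sketch in Python:

    # Sanity check of the quoted memory bandwidth, using the standard
    # GDDR6 formula: per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte.
    data_rate_gbps = 14   # Gbps per pin
    bus_width_bits = 96   # the narrower 96-bit bus of the RTX 3050 6 GB
    bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
    print(f"{bandwidth_gb_s:.0f} GB/s")  # -> 168 GB/s, matching the figure above

The same formula gives 224 GB/s for the original RTX 3050's 128-bit bus, putting the new SKU's 25% bandwidth cut in perspective.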

65 Comments on NVIDIA GeForce RTX 3050 6GB Formally Launched

#26
Lew Zealand
This card has a single reasonable use, like the RX 6400 and the GTX 1650 (and some Arc A380s):

Upgrade a decent (Core i7) older office PC with a proprietary PSU (common for years now). Hey, that's good: keep something outta the landfill, and even a 10-year-old Haswell i7 will be a good match in most games. But:

It's a 3040, just freakin' name it that and stop the obfuscation.
Too expensive, needs to be $30-50 cheaper.
#27
kapone32
bug: Tbh the low-end has rarely had good $/perf products. A few good ones here and there, but it's mostly a "why bother instead of going IGP" decision.
The 6600M was good, if you were willing to get one.
#28
Double-Click
Hot turd on a PCB shingle...
Also why the hell doesn't the FTC slap Nvidia around for these terribly deceptive naming practices?
#29
Random_User
azrael: The graphics card absolutely no one asked for...
Absolute waste of resources. Why? Can't wait to see the AIBs slap dual and triple-fan coolers on top of this garbage.
pressing on: NVIDIA's response to the AMD Radeon 780M graphics unit in the 8700G. They want buyers to weigh up whether they should go for the 8700G at $325 or the 3050 6GB for $180 with a CPU costing $145. The RTX 3050 6GB is so gimped that the performance gap might be a lot tighter than NVIDIA was banking on.
What is there to weigh up, Sir? Nothing personal, but this is a disaster. This crap takes up more space than its performance justifies. Almost two-generations-old, lowest-end garbage. There's not a single advantage of this card over the 780M. I dunno why Tensor cores are present here at all, considering they've cut the original memory down from 8 GB to just 6... in 2024.
Also, who in their right mind would consider this at 55% of the price of an entire 8-core APU? The cheapest R7 7700 non-X costs $315 on Amazon. Any other card for $180 would be better (a PowerColor RX 6600 costs $200 at Amazon). This is a product for inveterate fanboys.
bug: I doubt that very much. People looking at the 8700G and people looking to build a system with a dGPU aren't likely to switch sides. There are very, very few uses for a system equipped with a 3050 these days. It's mostly aimed at pre-builts sold to clueless buyers.

Edit: If Nvidia gets Lenovo or HP to build some systems around these, that's a few hundred thousand units sold. Easily.
Yeah, this is a complete scam. There's no way to sell this card outside prebuilts, short of deliberately taking advantage of non-savvy people. Might as well push RTX bull*sh*t narratives for this purpose.
Daven: I think it's safe to say that there will not be a 4050. That means the 4060 will be the budget GPU for the next-gen 5000 series. If that happens, the 5060 Ti might be the lowest-priced part.
The 4060 is already a rebadged 4050. But you're right: at this pace nVidia will rebrand the 4060 as the bottom of the stack and slap on an even higher price tag, just because it's 5000-series.
kapone32: For me, they are trying to take advantage of the Ray Tracing and DLSS narrative. This card has worse specs than my laptop 3060 from 2021.
LOL. Indeed. This is the first reason that comes to mind: capitalize on the ray tracing feature, which doesn't make sense on cards below the 4070 Super.
wNotyarD: This new 3050 should be stronger than a 3050 Mobile, which is even more power-constrained, and that one sits a bit below the 570. So yeah, RX 580 to GTX 1650S-level performance is the most we should expect of this.
The very fact of releasing a dGPU of RX 570 level in 2024 is an absolute slap in the face. There should be progress, not regress. At this point, the absolute lowest card that nVidia and AMD should be releasing is the 3060/6600. Not this waste of precious silicon.
Double-Click: Hot turd on a PCB shingle...
Also why the hell doesn't the FTC slap Nvidia around for these terribly deceptive naming practices?
Actually, both of them. AMD is as guilty as nVidia. Even worse, because AMD is gouging its trusting fanboys, who make up a much smaller market share and mostly are not as wealthy as Nvidia's. This was a red flag when AMD dropped the RX 6500 XT on the market.
#30
wNotyarD
Random_User: The very fact of releasing a dGPU of RX 570 level in 2024 is an absolute slap in the face. There should be progress, not regress. At this point, the absolute lowest card that nVidia and AMD should be releasing is the 3060/6600. Not this waste of precious silicon.
You know this silicon would be wasted either way, right? If it comes this far down the stack, it's because it couldn't be used for anything bigger with higher margins. So NVIDIA is literally selling scraps for whatever they can fool someone into spending. Either it becomes this waste-of-sand card, or it goes directly into the waste bin.

For what it's worth, this crap is actually stronger than anything else in its power envelope (and although RT is completely unreasonable on such low-end hardware, it can run any upscaler out there, especially DLSS, though without FG). However, it is priced (and named) with absolute lunacy. RTX 3040, $120 at most.
#31
Random_User
wNotyarD: You know this silicon would be wasted either way, right? If it comes this far down the stack, it's because it couldn't be used for anything bigger with higher margins. So NVIDIA is literally selling scraps for whatever they can fool someone into spending. Either it becomes this waste-of-sand card, or it goes directly into the waste bin.
Still a waste of PCB and components. How about selling these leftovers to someone who can repurpose them as some ASIC, without spending on an entire VGA production run? I mean, as some RAID controller, chipset, or even a GPU for a car. But still not at $180.
AFAIK, once upon a time, silicon companies were throwing in the trash any chips that didn't have at least 50% of their functionality, because it was more expensive to put them on a PCB than to throw them away. And the production lines could be used for something more viable instead.
#32
RayneYoruka
bug: Tbh the low-end has rarely had good $/perf products. A few good ones here and there, but it's mostly a "why bother instead of going IGP" decision.
I got it under MSRP, where there's 24% tax on top of it, so I can't really complain.
My old streaming rig had a GTX 1080, so it's still an upgrade xD

Aside from that, not really, no reason to bother. HTPC? Just get a GT 1030, it can decode 4K30 HDR just fine... or just go the iGPU route directly...
#33
deadmeme
theouto: Apologies, I see Gigabyte and my mind immediately erases the post from my memory.
That's funny, honestly; don't blame you. Gigabyte has been mid recently.
#34
THU31
If this were a 4050 based on Ada with exactly the same specs, it would actually be great for AV1 encoding. But this is just pointless.
#35
efikkan
bug: Jeez, in 2010 that kind of money bought you a 460. And that was a seriously good card.
And it's not like there has been any kind of inflation since 2010 either… oh wait. :P
Assimilator: Waiting for manufacturers to completely miss the point of this card and saddle it with dual-slot full-height coolers and ancillary power connectors.

Manufacturers, get a clue. Low-profile and/or single slot, purely PCIe slot-powered, or stop wasting our time.
Even assuming this thing only pulls 70 W, that is fairly close to the 75 W the PCIe slot is specced to deliver, considering GPUs can peak above their rating, and I don't trust the power delivery of every cheaper motherboard when hooked up to various things. So having the power connector is desirable for many, but it would be great if it were optional.
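
To put rough numbers on that, here is a minimal sketch; the transient multiplier is an assumed placeholder for illustration, not a measured value for this card:

    # Headroom between a 70 W TGP and the 75 W a PCIe x16 slot is specced to deliver.
    # The 1.2x transient multiplier is an assumption for illustration only.
    slot_budget_w = 75      # PCIe CEM slot power limit
    tgp_w = 70              # rated total graphics power
    transient_factor = 1.2  # assumed short power-spike multiplier
    peak_w = tgp_w * transient_factor
    print(f"assumed peak ~{peak_w:.0f} W vs {slot_budget_w} W slot budget")  # -> ~84 W

Under that assumption, short spikes would exceed the slot budget, which is why an optional connector would be reassuring.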
azrael: The graphics card absolutely no one asked for...
Well, the naming feels wrong considering how much lower the specs look compared to the earlier RTX 3050s.

Anytime there is an article about low-end GPUs, there are many who fail to understand there are uses for GPUs beyond demanding games, including (but not limited to):
- Replacing defective GPUs in an old system, whether it's an old OS or new. (Ampere is the last generation to support Win 7.)
- Upgrading a GPU to gain newer API support, but not play demanding games.
- Workstations or even servers which have no integrated graphics.
- Adding support for more monitors/high-resolution displays.
- Getting newer codec support (almost a must for 4K YouTube); luckily Ampere supports AV1 decode, which is becoming more important.
- Someone building a system for low-end gaming, emulation, etc.
- Anyone who wants a smooth desktop experience without any need for high GPU performance.
#36
bug
efikkan: And it's not like there has been any kind of inflation since 2010 either… oh wait. :P
Adjusted for inflation, $180 in 2010 would be ~$250 today. Can I get a card as solid as the 460 for $250 today?
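
For reference, a minimal sketch of that adjustment; the CPI figures below are approximate US annual averages and are an assumption, not numbers from this thread:

    # Inflation-adjust a 2010 price to 2024 dollars via the ratio of CPI values.
    # CPI figures are approximate US annual averages, assumed for illustration.
    cpi_2010 = 218.1
    cpi_2024 = 310.3
    price_2010 = 180
    price_2024 = price_2010 * cpi_2024 / cpi_2010
    print(f"${price_2024:.0f}")  # -> ~$256, in line with the ~$250 above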
#37
efikkan
bug: Adjusted for inflation, $180 in 2010 would be ~$250 today. Can I get a card as solid as the 460 for $250 today?
You mean an "okay" mid-range card? (not a GTX 460(?) performer I assume)
I would argue the RTX 4060 ($300) and RX 7600 ($270) fit a similar price point. Yes, that's a little more than the ~$250 you calculated with official inflation, but inflation estimates from the US and several other central banks are more than a little skewed by subjective metrics. If inflation were calculated the way the US did it in the 1970s (which is the way you probably think it works), the official numbers would be much higher, possibly ~70-80%.
Just look at pricing on most kinds of hardware, especially motherboards and accessories like fans; and in the past 3-4 years cases have become pricey. I'm not excusing Nvidia here, just noting that everything has gotten a lot more expensive lately (without diving into politics).
#38
Lew Zealand
bug: Adjusted for inflation, $180 in 2010 would be ~$250 today. Can I get a card as solid as the 460 for $250 today?
I like to compare how the 460 did in games at the time it was released, maybe at sub-1080p to be fair, but 1080p is reasonable. Some people here argue that's not fair as the games are different. Yeah, the cards are different, too. I'm interested in whether we're getting more performance at that price point (plus inflation) in 2024 than we did in 2010.

However, if you look at the $229 1 GB 460's reviews, the tests are dominated by older games, so it's dishing out over 100 fps in many of them at 1680x1050, but in newer, taxing games it's getting about 30 fps. In the 4060 review, the card gets ~54 fps at 1080p in 2 games and everything else is over 60. Even at 1440p only a single game dips below 40 fps.

The 1 GB $229 GTX 460 would be $320 today, making it more expensive than the 4060, and it's slower in the demanding games of its day. The $199 768 MB GTX 460 ($280 in today's money) is behind the 1 GB 460 by 5-10%, making it relatively even slower than the $270 RX 7600 (2% behind the 4060), as another comparison.

The 4060 is at least as solid as the 460. We just have soooo many tiers above the regular '60 level nowadays (8 levels!) and nothing below, so it feels like entry-level. Entry level is now used or previous-gen.
#39
bug
Lew Zealand: I like to compare how the 460 did in games at the time it was released, maybe at sub-1080p to be fair, but 1080p is reasonable. Some people here argue that's not fair as the games are different. Yeah, the cards are different, too. I'm interested in whether we're getting more performance at that price point (plus inflation) in 2024 than we did in 2010.

However, if you look at the $229 1 GB 460's reviews, the tests are dominated by older games, so it's dishing out over 100 fps in many of them at 1680x1050, but in newer, taxing games it's getting about 30 fps. In the 4060 review, the card gets ~54 fps at 1080p in 2 games and everything else is over 60. Even at 1440p only a single game dips below 40 fps.

The 1 GB $229 GTX 460 would be $320 today, making it more expensive than the 4060, and it's slower in the demanding games of its day. The $199 768 MB GTX 460 ($280 in today's money) is behind the 1 GB 460 by 5-10%, making it relatively even slower than the $270 RX 7600 (2% behind the 4060), as another comparison.

The 4060 is at least as solid as the 460. We just have soooo many tiers above the regular '60 level nowadays (8 levels!) and nothing below, so it feels like entry-level. Entry level is now used or previous-gen.
Here's the thing: I had the 460 and it played everything I threw at it at 1920x1200, no sweat. I do know how to fiddle with graphics settings, fwiw. I almost pulled the trigger on a 4060, so I could have told you whether it can do the same, but in the end I decided I wasn't getting my money's worth and didn't.

Another thing is, in 2010 1920x1080/1200 was about the highest resolution available*. In 2024, that would be 4K.

*just like today, there were higher res monitors, albeit rare/expensive
#40
tussinman
I remember 2 years ago the 3050 was a laughing stock: $300 USD for a card that could barely outperform a 2016-era GTX 1070. In 2022/2023 Nvidia actually started reselling and remaking the 2060, which was cheaper and faster than the 3050; that's how bad it was.

Shocked that it's not only coming back in 2024, but they're actually going to make it even slower?? (Might as well just start reselling the 2060 again, or just call the 2060 the 3050 6 GB.)
#41
sLowEnd
To begin with, the most obvious change is memory. The new SKU features 6 GB of 14 Gbps GDDR6 across a narrower 96-bit memory bus, for 168 GB/s of memory bandwidth. That's not all: the GPU is significantly cut down, with just 16 SM instead of the 20 found on the original RTX 3050. This works out to 2,048 CUDA cores, 64 Tensor cores, 16 RT cores, 64 TMUs, and an unchanged 32 ROPs.
Nvidia's page for the 3050 has it pegged at 2,304 cores, not 2,048.

www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3050/
#42
john_
AMD slow card. Full of negative reviews.
Nvidia slow card. Silence.

RX 6500XT close to RX 6600 pricing? "What a garbage card".
RTX 3050 6GB close to RX 6600 pricing? "There is NO RX 6600. RTX 3050 is faster than RX 6500XT, so it is good"!!!!


It's really funny until you realize that this card will become a sales hit, selling to those dreaming of "ray tracing from Nvidia". Funnier still that those buyers will end up using AMD's FSR 3 Frame Generation to somehow get acceptable fps in some games with ray tracing enabled.
#43
Beginner Macro Device
bug: Here's the thing: I had the 460 and it played everything I threw at it at 1920x1200, no sweat. I do know how to fiddle with graphics settings, fwiw.
And I have a GPU that's not very noticeably faster than a 4060, namely a 6700 XT. I play at 4K. Almost everything is comfortable (within my 48-60 VRR range), and everything is playable (though I didn't try Lords of the Fallen; that must be the one game my GPU can't run at 4K even at the lowest preset). I don't know what you are trying to achieve. 1440p gaming is now more affordable than 1080p gaming was 15 years ago. 4K gaming is probably a couple percent more taxing, but we're about to get there within a year or so.
john_: RTX 3050 6GB close to RX 6600 pricing?
Yes, it's a completely rubbish GPU if we're only talking $ per FPS. But it's by far the strongest <75 W offering on the market. The 6500 XT draws more power, and the 6400 is extensively bad even when paired with a PCIe 4.0 system. Intel Arc GPUs aren't ideal either. Professional GPUs are rare and expensive. AMD could easily have gimped their RX 7600's clocks and shader count to the point it draws <75 W and that would be that, but they couldn't be arsed. NV did. Not to mention that DLSS is key in such low-tier units. Upscaling can be avoided if you go for higher-tier GPUs, but not with 3050-tier video cards. And DLSS is much better than FSR.
#44
efikkan
bug: Here's the thing: I had the 460 and it played everything I threw at it at 1920x1200, no sweat. I do know how to fiddle with graphics settings, fwiw. I almost pulled the trigger on a 4060, so I could have told you whether it can do the same, but in the end I decided I wasn't getting my money's worth and didn't.

Another thing is, in 2010 1920x1080/1200 was about the highest resolution available*. In 2024, that would be 4K.

*just like today, there were higher res monitors, albeit rare/expensive
In relative performance, the GTX 460 was great because it had ~66% of the performance of the GTX 480 while costing only 40% as much, and it also performed at about the level of the previous generation's top models, so it was a meaningful performance tier at the time.

Comparing this to the current lineup, an RTX 4060 performs at about ~33% of an RTX 4090, and the price difference is huge. The big question then becomes: should we judge the RTX 4060 based on how "weak" it is compared to its bigger brother, or should we consider what kind of gaming experience it will actually offer for the money?
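
One back-of-the-envelope way to frame that question is performance share divided by price share. A minimal sketch; the $300 and $1600 figures are assumed ballpark prices, not quotes from this thread:

    # Fraction of flagship performance per fraction of flagship price.
    def perf_per_price_share(perf_share: float, price_share: float) -> float:
        return perf_share / price_share

    print(perf_per_price_share(0.66, 0.40))        # GTX 460 vs GTX 480: ~1.65
    print(perf_per_price_share(0.33, 300 / 1600))  # RTX 4060 vs RTX 4090: ~1.76

By that crude ratio the two cards look comparable, but the ratio ignores absolute price and the actual gaming experience, which is exactly the question being asked.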

Another important aspect, since you're bringing up Fermi, is longevity. Fermi wasn't just prone to overheating; it quickly became "obsolete" due to performance demands, changes in performance characteristics, and API support. This was actually pretty much the norm for every generation up to then; if someone bought a pretty good GPU, then 2-3 years later there would be many games they couldn't play, or at least play well. And I had many cards in that era, like the GTX 680, GTX 580, Radeon HD 4870, GTX 275 (traded), 9800 GT, 7800 GT, 6600 GT (SLI), and many before that, all the way back to the Riva TNT2. All these cards have one thing in common: they became "obsolete" long before they deserved to, either due to performance or API support/features.

But starting from Maxwell (900 series), the cards have had remarkably good mileage in them. Especially Pascal (like the GTX 1060 I see in your profile) will probably go down as legendary in terms of long-term value. Granted, the performance difference isn't as great per generation, but the APIs don't change erratically either. Maxwell, Pascal, Turing and Ampere all seem to have lasting value, especially for anyone who bought xx70 models or up. In retrospect it's easy to see that during this time span buying one or two tiers up would have gotten people more mileage, vs. back in the GTX 460 days when everything was quickly obsolete anyway. It's also worth mentioning that anything from Maxwell on still has mainline driver support with features and improvements, which would not have been the case with 10-year-old cards back in 2010.

But is it fair to demand 4K from a ~$250 card today?
The main reason the high-end cards today can do 4K well is that they're massive GPUs, drawing huge amounts of power, several "tiers" higher than back in the GTX 460 days. We also have higher refresh rates, so the reality is that a lower mid-range card today can usually do 1080p at a good frame rate, or 1440p at a lower frame rate or lower details.

My assessment is that the RTX 4060 is a little too poor of a deal, but not terrible.
If I were looking to build a gaming PC today and wondering how far to stretch my budget (how expensive a GPU I can justify), I would probably go a good step up. The RTX 4060 is already a little weak today; the RTX 4060 Ti is of course better, but the jump up to the RTX 4070 is huge, and especially now with the price cuts, I'd say go for the RTX 4070 or the two compelling models from AMD: the RX 7700 XT and RX 7800 XT. Those three models, I believe, are the "sweet spot" GPUs today, in terms of getting a great experience now and some longevity.
#45
Readlight
Interesting what a student will get for 200 euro.
#46
bug
@efikkan Fwiw, I don't even factor in the xx90 cards. I consider them the successors to the former Titans. In my head, Nvidia's lineup only goes to xx80, with the xx90 thrown in there just to test where people actually stop buying.
#47
3valatzy
pressing on: NVIDIA's response to the AMD Radeon 780M graphics unit in the 8700G.
More like Nvidia's response to the RX 6400 4 GB ($120) and RX 6500 XT 4 GB ($155).
john_: AMD slow card. Full of negative reviews.
Nvidia slow card. Silence.

RX 6500XT close to RX 6600 pricing? "What a garbage card".
RTX 3050 6GB close to RX 6600 pricing? "There is NO RX 6600. RTX 3050 is faster than RX 6500XT, so it is good"!!!!

It's really funny until you realize that this card will become a sales hit, selling to those dreaming of "ray tracing from Nvidia". Funnier still that those buyers will end up using AMD's FSR 3 Frame Generation to somehow get acceptable fps in some games with ray tracing enabled.
Prima.Vera: This should have been a single-slot, small-form-factor card for $99, the right price. The rest is just nGreedia TAX.
Nvidia is the premium brand. Has anyone ever criticised Apple, for example, or Samsung for their skyrocketing phone prices?
Of course, it's pretty clear that those who defend Nvidia are dependent, both financially and maybe ideologically.
#48
Prima.Vera
3valatzy: Has anyone ever criticised Apple, for example, or Samsung for their skyrocketing phone prices?
YES
#49
efikkan
bug: @efikkan Fwiw, I don't even factor in the xx90 cards. I consider them the successors to the former Titans. In my head, Nvidia's lineup only goes to xx80, with the xx90 thrown in there just to test where people actually stop buying.
Yeah, I tend to agree with that.
It's nice that they do exist, but I would only consider them for a heavy professional workload that justifies that.

As for the realistic options: back with Ampere, the RTX 3060 / Ti models were very well positioned at the time (only you couldn't find them at MSRP then). Currently the 4060/Ti models fall a little short, but are still great products in terms of what they offer, and I believe you can at times find them below MSRP.
But I still think RTX 4070 along with RX 7700 XT and RX 7800 XT are the models to consider for a "decent" gaming PC today.
#50
Beginner Macro Device
efikkan: Currently the 4060/Ti models fall a little short, but are still great products in terms of what they offer
Non-Ti: almost, yes. A slightly cheaper and 1.5 times more efficient version of the 3060. For $250, it would be marvellous.
Ti (8 GB): no. I don't have anything against the device itself; it's just horribly overpriced. For $330, it would be 10/10.
Ti (16 GB): makes no sense for anyone except those running professional workloads where this speed is plenty but 8 GB is "are you kidding me."