
NVIDIA GeForce RTX 4050 "Ada" Launches This June

btarunr

Editor & Senior Moderator
NVIDIA's mainstream GeForce RTX 4050 "Ada" graphics card, which succeeds the RTX 3050, reportedly launches in June 2023. This could end up being the highest-volume SKU in the RTX 40-series desktop lineup. Team green is planning to launch a new desktop SKU every month leading up to summer. April will see the company launch the performance-segment RTX 4070, followed by the RTX 4060 Ti and RTX 4060 in May, and now we hear about the RTX 4050 in June.

The GeForce RTX 4050 is likely based on a heavily cut-down version of the 5 nm "AD107" silicon that also powers the RTX 4060 in its maxed-out configuration. The AD107, much like the AD106, features a 128-bit wide GDDR6 memory interface. For the RTX 4050, NVIDIA could narrow this down to 96-bit and give it 6 GB of GDDR6 memory, 25% less than the 8 GB over a 128-bit bus that's standard for the current-generation RTX 3050. NVIDIA will have worked out the performance numbers, and the RTX 4050 could still end up generationally faster than the RTX 3050 despite the narrower bus and smaller memory.
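As a rough back-of-the-envelope check of what that 96-bit cut would mean, here is a sketch assuming 16 Gbit (2 GB) GDDR6 chips at 18 Gbps, neither of which NVIDIA has confirmed:

Code:
# Rough capacity/bandwidth math for the rumored RTX 4050 memory setup.
# Assumptions (unconfirmed): 16 Gbit (2 GB) GDDR6 chips, 18 Gbps per pin, one chip per 32-bit channel.
def gddr6_config(bus_width_bits, gbps_per_pin=18, gbit_per_chip=16):
    channels = bus_width_bits // 32              # 32-bit channels on the bus
    capacity_gb = channels * gbit_per_chip // 8  # Gbit -> GB
    bandwidth_gbs = bus_width_bits * gbps_per_pin / 8
    return channels, capacity_gb, bandwidth_gbs

print(gddr6_config(128))  # full 128-bit AD107 (RTX 4060): (4, 8, 288.0) -> 8 GB, 288 GB/s
print(gddr6_config(96))   # rumored 96-bit RTX 4050:       (3, 6, 216.0) -> 6 GB, 216 GB/s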



View at TechPowerUp Main Site | Source
 
Waiting for the RX 7600 to beat the RTX 4050 in value…
 
3060 performance for $349 get em while they are hot boyz!!!
 
At this point, I wouldn't be surprised if the 4050 ended up being slower than the 3050 with less VRAM while costing more. The things Nvidia gets away with these days are mind-boggling.
 
y/a/w/n///////

Can someone please wake me up when/if they drop to $100? Then perhaps I'd consider buying one. Until then, it's back to my nappy time :)
 
In the TPU GPU database, the power consumption of the upcoming RTX 4050 is listed as 150 W, much higher than that of the RTX 4060 (116 W). Why?
 
I need an "interim" GPU for when and if my GPU dies or I am waiting on a new GPU, let's see if nVidia can price it correctly.
 
AMD thought it was wise to release the RX 6500 XT with only 4 GB. I don't think it worked out sales-wise.
 
They should cut costs by using 6 nm instead of 5 nm; it would be difficult, but they could try to keep the price close to $200-250.
 
Cute.

Too bad it will be overpriced, as nVidia likes to do.
Our GPUs are not overpriced, they are just slightly positioned outside the realm of reality.

Love,
Jensen Huang

And remember kids, the more GPUs you buy the more you save
 
I can't wait for the $249 64-bit, PCIe x4, 4 GB RTX 4030 and the pinnacle of the pack, the 4010 with 2 GB on a 32-bit bus and PCIe x2 for $149.
Celebrating competition all year long.
 
So August/Sept - 4090 Ti?
 
So August/Sept - 4090 Ti?


Gamers will get the 4090 Ti when Nvidia is unable to sell the full dies to the professional market, so probably around 6 months prior to the 50-series launch.
 
In the TPU GPU database, the power consumption of the upcoming RTX 4050 is listed as 150 W, much higher than that of the RTX 4060 (116 W). Why?
Would it really surprise you if it's true?

The 3060 Ti uses just a little more power than the 3060 in the Gaming and Maximum tests, but less or about the same in W1zzard's other test results (quick percentage math after the numbers below).

Gaming:
3060 - 181W
3060Ti - 199W

Maximum:
3060 - 179W
3060Ti - 197W

V-Sync 60 Hz:
3060 - 176W
3060Ti - 73W

Idle:
3060 - 13W
3060Ti - 9W

Multi-Monitor:
3060 - 16W
3060Ti - 17W

Video Playback:
3060 - 17W
3060Ti - 17W
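Relative to the 3060, that works out to roughly +10% in Gaming and Maximum and a wash (or a saving) elsewhere; a quick sketch using only the figures above (the V-Sync pair is left out, since the 73 W reading looks like it could be a typo):

Code:
# Percentage difference in power draw, 3060 Ti vs 3060, from the figures quoted above (watts).
figures = {
    "Gaming":         (181, 199),
    "Maximum":        (179, 197),
    "Idle":           (13, 9),
    "Multi-Monitor":  (16, 17),
    "Video Playback": (17, 17),
}
for test, (w_3060, w_3060ti) in figures.items():
    delta = (w_3060ti - w_3060) / w_3060 * 100
    print(f"{test}: 3060 Ti {delta:+.0f}% vs 3060")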
 
I can't wait for the $249 64-bit, PCIe x4, 4 GB RTX 4030 and the pinnacle of the pack, the 4010 with 2 GB on a 32-bit bus and PCIe x2 for $149.
Celebrating competition all year long.
You jest, but I'd genuinely like to see a raytrace capable successor to the GTX 1650, no PCIe cable needed and all. Us budget gamers gotta eat too! Of course it needs to have GTX 1650 prices too...
 
You jest, but I'd genuinely like to see a raytrace capable successor to the GTX 1650, no PCIe cable needed and all. Us budget gamers gotta eat too! Of course it needs to have GTX 1650 prices too...

RT in the low end would of course be great, but where will you use that weak RT performance? 2D games with RT, is that a thing? :D
 
Gamers will get the 4090 Ti when Nvidia is unable to sell the full dies to the professional market, so probably around 6 months prior to the 50-series launch.
Blackwell is sometime in 2024. Probably Q2/Q3 again... if not Q4.
 
Be nice if there is a low profile version.
 
RT in the low end would of course be great, but where will you use that weak RT performance? 2D games with RT, is that a thing? :D
Now that I think of it, all that RT is good for in lower end cards is to spice up screenshots, otherwise you'll be playing a slideshow. The real RTX feature that would be great for lower end GPUs is DLSS support...
 
Honestly, if they can get it under 75 W, with at least 6 GB of VRAM, and it's faster than the 3050, then I'm interested. My 560x is getting long in the tooth and the media PC only takes LP cards.

If they can't manage that, then I'll pick up a used A2000 off of eBay; those are about 3050 speed.
 
Nvidia -------------->:nutkick:<------------ Loyal Nvidia customer
 
Now that I think of it, all that RT is good for in lower end cards is to spice up screenshots, otherwise you'll be playing a slideshow. The real RTX feature that would be great for lower end GPUs is DLSS support...
I would say a modern CPU will do a better job than a low-end GPU with the RT option. That said, you do have a valid point.
 