
NVIDIA Plans RTX 3050 A with Ada Lovelace AD106 Silicon

Nomad76 (News Editor, Staff member)

NVIDIA may be working on a new RTX 3050 A laptop GPU using an AD106 (Ada Lovelace) die, moving away from the Ampere chips used in other RTX 30-series GPUs. While not officially announced, the GPU is included in NVIDIA's latest driver release and the PCI ID database as GeForce RTX 3050 A Laptop GPU. The AD106 die choice is notable, as it has more transistors and CUDA cores than the GA107 in current RTX 3050s and the AD107 in RTX 4050 laptops. The AD106, used in RTX 4060 Ti desktop and RTX 4070 laptop GPUs, boasts 22.9 billion transistors and 4,608 CUDA cores, compared to GA107's 8.7 billion transistors and 2,560 CUDA cores, and AD107's 18.9 billion transistors and 3,072 CUDA cores.
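For a rough sense of scale, here's a minimal sketch in Python comparing the three dies using the figures quoted above (bearing in mind that a 3050 A would almost certainly use a cut-down AD106, so the full-die ratios are ceilings, not performance estimates):

```python
# Back-of-the-envelope comparison of the dies mentioned in the article.
# A hypothetical RTX 3050 A would use only part of AD106's resources,
# so these full-die ratios are upper bounds, not performance estimates.
dies = {
    "GA107 (RTX 3050 laptop)": {"transistors_bn": 8.7, "cuda_cores": 2560},
    "AD107 (RTX 4050 laptop)": {"transistors_bn": 18.9, "cuda_cores": 3072},
    "AD106 (RTX 3050 A?)": {"transistors_bn": 22.9, "cuda_cores": 4608},
}

base = dies["GA107 (RTX 3050 laptop)"]
for name, spec in dies.items():
    t_ratio = spec["transistors_bn"] / base["transistors_bn"]
    c_ratio = spec["cuda_cores"] / base["cuda_cores"]
    print(f"{name}: {spec['transistors_bn']} B transistors ({t_ratio:.2f}x GA107), "
          f"{spec['cuda_cores']} CUDA cores ({c_ratio:.2f}x GA107)")
```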

While this could potentially improve performance, NVIDIA will likely use a cut-down version of the AD106 chip for the RTX 3050 A. The exact specifications and features, such as support for DLSS 3, remain unknown. The use of TSMC's 4N node in AD106, instead of the Samsung 8N node used in Ampere, could improve power efficiency and battery life. How the RTX 3050 A performs against existing RTX 3050 and RTX 4050 laptops remains to be seen; however, it will likely land close to existing Ampere-based parts, as NVIDIA tends to use similar names for comparable performance levels. It's unclear whether NVIDIA will bring this GPU to market, but adding new SKUs late in a product's lifespan isn't unprecedented.



View at TechPowerUp Main Site | Source
 
Doesn't make any sense. Why not 4550, or 4050S, or 5040, 5050, 5030, 5010 or something?
 
It's probably meant to succeed the RTX 2050, which is Ampere-based despite the 2000-series branding

Probably psychological: a special branch of marketing theory that leads them to believe some users will stick with something they consider "legendary" or "iconic".
Upgrading an old product with new features to push its shelf life forward a bit...
 
Probably psychological: a special branch of marketing theory that leads them to believe some users will stick with something they consider "legendary" or "iconic".
Upgrading an old product with new features to push its shelf life forward a bit...
I doubt these are "new" AD106 chips; more likely there are enough leftovers in NVIDIA's stocks.
 
It's probably meant to succeed the RTX 2050, which is Ampere-based despite the 2000-series branding

And like the 2050 compared to the regular 3050, this 3050 A will probably have half the bus width and a similar core count to the 4050, giving significantly lower performance.

Hence giving it the last-gen name, so as not to sully the "good name" of current-gen products with its very low performance.

Probably psychological: a special branch of marketing theory that leads them to believe some users will stick with something they consider "legendary" or "iconic".
Upgrading an old product with new features to push its shelf life forward a bit...

Just the opposite, see above.
 
It's probably meant to succeed the RTX 2050, which is Ampere-based despite the 2000-series branding
Problem is that the 2050 is just a cut-down version of the 3050 using Ampere, which is also confusing naming
 
A 3050 with GDDR7 and the clocks turned down (40 W) would be similar to a desktop 3050... hard to beat for value.
 
Wonder how this will compare to Strix Halo
 
Seems really silly when there is no 4050. At least it won't be as bad as AMD, which used to be really bad with this during the TeraScale and GCN days; the Radeon 200 series spanned four different generations.

Maybe it'll be an OEM-only card, for markets where the 3050 is perhaps still selling pretty well. Will be curious whether drivers allow it to support DLSS 3/3.5.
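Once drivers that list the card ship, it's easy to check what the GPU reports itself as. A minimal sketch, assuming a machine with the NVIDIA driver installed and nvidia-smi on the PATH (the 3050 A's PCI device ID isn't public, so the output shown is purely illustrative):

```python
# Query the driver for the GPU name and PCI device ID via nvidia-smi.
# Assumes the NVIDIA driver is installed; name, pci.device_id, and
# driver_version are standard nvidia-smi query fields.
import subprocess

result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,pci.device_id,driver_version",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())
# Hypothetical output: NVIDIA GeForce RTX 3050 A Laptop GPU, 0x????10DE, 555.xx
```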
 
Wow, 4 GB, 1,768 CUDA cores, and a 64-bit bus.
AMD cheers loudly as the RX 6400 is knocked off its throne as worst video card ever.
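To put that 64-bit bus in numbers, a quick sketch (the actual memory speed is unannounced, so the GDDR6 data rates below are assumptions for illustration):

```python
# Peak memory bandwidth = bus width in bytes x effective data rate.
# The RTX 3050 A's memory spec is unknown; these are common GDDR6 rates.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

for rate in (14.0, 16.0, 18.0):  # Gbps per pin
    print(f"64-bit @ {rate:.0f} Gbps -> {bandwidth_gb_s(64, rate):.0f} GB/s")
# Prints 112, 128, and 144 GB/s for the three assumed rates.
```

For comparison, the regular 3050's 128-bit bus at the same data rate would double those figures, which is why the bus width matters at least as much as the core count here.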
 
What people forget is that the 3050 replaced the 3060, which gave you 2 GB more VRAM. They even had the nerve to charge more for it, too: 3050 and then 4050 laptops for about $200 more than 3060-based laptops from 2021.
 
There's no way it has GDDR6 RAM...

100% chance it has GDDR6; even the craptacular GTX 1630 has GDDR6. Go ahead and name the last dGPU Nvidia released without GDDR.
 
I feel that with 4 gigs it should be a 3030 card, as 4 gigs nowadays is the same as having 2 gigs back when the 1030 was released.
 