
NVIDIA GA107-based GeForce RTX 3050 is Real, Comes with 11% Lower TDP, Same Specs

btarunr

Editor & Senior Moderator
Staff member
When NVIDIA launched the GeForce RTX 3050 "Ampere" on "GA106" silicon, with specifications the smaller "GA107" could just as easily fulfill, we knew the company could eventually start making RTX 3050 boards with the smaller chip, and it has. Igor's Lab reports that RTX 3050 cards based on GA107 come with a typical board power of 115 W, about 11 percent lower than the 130 W of GA106-based cards.

There's no difference in specifications between the two cards. Both feature 2,560 CUDA cores across 20 streaming multiprocessors, 80 Tensor cores, 20 RT cores, and a 128-bit wide GDDR6 memory interface holding 8 GB of memory that ticks at a 14 Gbps data rate (224 GB/s bandwidth). The GA106 and GA107 ASICs share a common fiberglass substrate and are hence pin-compatible, for the convenience of board partners; the latter simply has a smaller die. Any cooling solution designed for the launch-day RTX 3050 should therefore work perfectly fine with cards based on GA107.
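
For anyone who wants to sanity-check the bandwidth figure, 224 GB/s follows directly from the bus width and data rate. A quick sketch of the arithmetic in Python, using only the numbers from the article:

Code:
# GDDR6 bandwidth = (bus width in bits / 8 bits per byte) x per-pin data rate
bus_width_bits = 128      # 128-bit memory interface
data_rate_gbps = 14       # 14 Gbps effective data rate per pin

bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps
print(f"{bandwidth_gb_s:.0f} GB/s")  # prints: 224 GB/s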



View at TechPowerUp Main Site
 
Probably won't be reflected in pricing -.-
 
Probably won't be reflected in pricing -.-
Actually, crypto miners will favor this variant over the original model since it reduces electricity costs. Shaving 11 percent off the TDP might shorten the break-even time by 3-5 weeks (depending on local electricity prices).

I expect a lower-TDP 3050 to command more on the street than the higher-TDP model.
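
A rough back-of-the-envelope version of that break-even math, assuming the card mines at its rated board power; the card price, daily revenue, and electricity rate below are illustrative placeholders, not real figures:

Code:
# Break-even estimate: days until mining revenue covers the card's price.
# All inputs are illustrative assumptions, not measured RTX 3050 numbers.
card_price_usd = 400.00           # assumed street price
revenue_per_day_usd = 1.50        # assumed gross mining revenue per day
electricity_usd_per_kwh = 0.15    # assumed local electricity rate

def break_even_days(power_watts: float) -> float:
    """Days to recoup the card's price at the assumed revenue and power draw."""
    energy_cost_per_day = power_watts / 1000 * 24 * electricity_usd_per_kwh
    return card_price_usd / (revenue_per_day_usd - energy_cost_per_day)

for tdp in (130, 115):  # GA106 vs. GA107 board power from the article
    print(f"{tdp} W -> break-even in ~{break_even_days(tdp):.0f} days")

# With these inputs: 130 W -> ~388 days, 115 W -> ~368 days,
# i.e. roughly a three-week difference.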
 
Actually, crypto miners will favor this variant over the original model since it reduces electricity costs. Shaving 11 percent off the TDP might shorten the break-even time by 3-5 weeks (depending on local electricity prices).

I expect a lower-TDP 3050 to command more on the street than the higher-TDP model.


How does this even happen? You're assuming there's going to be some official card rev?

The swaps will be invisible to end users (and just in case NVIDIA has trouble sourcing enough GA107, they will likely keep the higher-TDP coolers for the next six months).
 
Actually, crypto miners will favor this variant over the original model since it reduces electricity costs.
Will it, though? Crypto mining doesn't load the GPU cores, only the memory, so if the memory is identical, mining wattage won't change at all.
 
As long as triple MSRP is a thing, none of this is interesting for the average Joe.
 
Actually, crypto miners will favor this variant over the original model since it reduces electricity costs. Shaving 11 percent off the TDP might shorten the break-even time by 3-5 weeks (depending on local electricity prices).

I expect a lower-TDP 3050 to command more on the street than the higher-TDP model.
There's something weird here: AMD's cards are much better for mining right now, yet for some reason NVIDIA's cards are more expensive. Either gamers or miners are dumb.
 
Will it, though? Crypto mining doesn't load the GPU cores, only the memory, so if the memory is identical, mining wattage won't change at all.
This is somewhat accurate.

The cores are still active, just at greatly reduced voltages and clocks. If GA107 cards have an 11% lower TDP, that figure applies to full core-plus-memory workloads like intensive gaming at higher resolutions. It'll probably still translate to roughly 5% lower power consumption when mining ETH. Nothing significant, but any power reduction is welcome.

Other algorithms, like ERGO's Autolykos and Ravencoin's KAWPOW, use the GPU core more than ETH does, so arguably anyone mining those will benefit more.
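
Here's a toy model of why an 11% TDP cut shrinks to only a few percent under a memory-bound workload. The fixed memory/board draw and the 30% core load are assumptions for illustration, not measurements:

Code:
# Toy model: memory subsystem and board overhead draw the same power on both
# cards (identical 8 GB GDDR6 at 14 Gbps); only the core power budget differs.
# The 50 W fixed draw and 30% core load are illustrative assumptions.
mem_and_board_w = 50.0
core_full_w = {"GA106": 130.0 - mem_and_board_w,   # 80 W core budget
               "GA107": 115.0 - mem_and_board_w}   # 65 W core budget
core_load_mining = 0.30   # assumed core utilization while mining ETH

mining_w = {chip: mem_and_board_w + core * core_load_mining
            for chip, core in core_full_w.items()}
saving = 1 - mining_w["GA107"] / mining_w["GA106"]
print(mining_w)                               # {'GA106': 74.0, 'GA107': 69.5}
print(f"mining-power saving: ~{saving:.0%}")  # ~6%, not 11%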

There's something weird here: AMD's cards are much better for mining right now, yet for some reason NVIDIA's cards are more expensive. Either gamers or miners are dumb.
I think at this point it comes down to scalpers.
 
I still don't understand why they couldn't make a 3050 under 75 W TDP with no external power connector (and low-profile possibilities), and a 3050 Ti with the specs of the real-world 3050.
 
I still don't understand why they couldn't make a 3050 under 75 W TDP with no external power connector (and low-profile possibilities), and a 3050 Ti with the specs of the real-world 3050.

They can; they're just saving those for the notebook cores.

With the 6500 XT, AMD is trying to sell a part with the same power consumption while offering half the 4K performance (I think the power consumption itself is fine!)
 
They can; they're just saving those for the notebook cores.

With the 6500 XT, AMD is trying to sell a part with the same power consumption while offering half the 4K performance (I think the power consumption itself is fine!)
AMD took the wrong route of clocking a PCIe x4 notebook chip through the roof and trying to sell it as a desktop GPU for gaming.

Power consumption is not fine, 1) considering the performance level, and 2) for people looking for low-profile cards for compact systems. I understand it's a niche market, but still, there's no need to squeeze the hell out of an otherwise crappy GPU for 5% extra performance.
 
I really hate subtle revisions and hardware changes, but at least they kept the core counts and clocks the same.
 