
Lower Mainstream Graphics Segment Sees Action with Arc A580 and GeForce RTX 3050 6GB

btarunr

Editor & Senior Moderator
The lower mainstream graphics segment is considered the starting point for PC gaming, targeting 1080p gaming at medium-to-high (though not extreme) settings, and popular e-sports titles at 1080p with high settings. This segment is about to see some action in the coming days, with the introduction of two new products: the Intel Arc A580, and a new 6 GB variant of the GeForce RTX 3050. We've seen the A580 "Alchemist" in development for a while now.

Based on the 6 nm ACM-G10 silicon, the Arc A580 comes with 24 Xe Cores, or 384 EU (execution units), which work out to 3,072 unified shaders, compared to the 3,584 of the A750, the 4,096 of the A770, and the significantly lower 1,024 of the entry-level A380. The most interesting aspect of the A580 is its memory. Although only 8 GB in size, it sits on a wide 256-bit memory interface running at 16 Gbps, which works out to a generous 512 GB/s of bandwidth. The A580 also comes with a full PCI-Express 4.0 x16 host interface.
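As a sanity check, the quoted 512 GB/s follows directly from the bus width and per-pin data rate. A minimal sketch of the standard bandwidth formula (function name is ours, for illustration):

```python
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Arc A580: 256-bit bus at 16 Gbps
print(mem_bandwidth_gbs(256, 16))  # 512.0 GB/s
```

The same formula explains why the RTX 3050 6 GB's narrower 96-bit bus will land at a much lower figure at comparable GDDR6 speeds.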



Meanwhile, the new GeForce RTX 3050 6 GB in development is expected to be based on the 8 nm "GA107" silicon and the "Ampere" graphics architecture. It gets 6 GB of GDDR6 memory over a narrower 96-bit memory interface. At this point, it is not known whether the RTX 3050 6 GB has a lower CUDA core count than the regular RTX 3050 8 GB, which maxes out the "GA107" and also comes in variants based on the "GA106."

View at TechPowerUp Main Site | Source
 
3050 6GB - this is what the mobile segment needed a long time ago, but instead they released the mobile 3050 4GB, obsolete at release.

Yep, one of the many reasons I dislike Nvidia; they knew better. And pretty much every mid-tier to lower-end gaming laptop has had a 3050 4GB in it for the last two years. It's sad. No need for that nonsense.
 
Yeah, no thanks. The value of the RTX 3050 is SO broken IMO, especially in my country. Second-hand offerings are priced on par with the RTX 3060 Ti and even the RTX 3070. Also, something like the Pulse RX 6650 XT is what, 250 EUR brand new? Hard pass IMO.
 
Pure e-waste imo
 
A580 - launch date, price, benchmarks, power consumption? Leaks suggest good performance but also high power draw, which isn't really suitable for a "mid" card. Let's hope data comes out soon.
 
Pure e-waste imo
Indeed, but with one exception. If it comes as a 75w, slot-powered GPU (even slightly cut-down) it will be welcome. Not very optimistic about it, just saying...
 
Indeed, but with one exception. If it comes as a 75w, slot-powered GPU (even slightly cut-down) it will be welcome. Not very optimistic about it, just saying...

Except they did it wrong. Cutting down power-hungry Ampere to 75W vs. power-efficient Ada Lovelace is again giving the middle finger to the low-end buyer.

Here, have better performance than the 1650 but not nearly as much as you could have enjoyed.
 
Except they did it wrong. Cutting down power-hungry Ampere to 75W vs. power-efficient Ada Lovelace is again giving the middle finger to the low-end buyer.

Here, have better performance than the 1650 but not nearly as much as you could have enjoyed.
Well, even Ampere has potential for much better power management, as the RTX A2000 proves.
In theory, a previous-generation node should mean a more budget-friendly GPU (but with Nvidia, maybe not :nutkick:). The RTX 4000 Ada Generation SFF is a very powerful 70 W GPU, but the price is very high even for a workstation card.
 
Cool, Nvidia is finally releasing the second RTX 4030! I think it is very anti-consumer to give it the SAME EXACT NAME as the RTX 4040, which is the first RTX 4030. :kookoo:
 