
GALAX Designs a GeForce GTX 1650 "Ultra" with TU106 Silicon

btarunr

Editor & Senior Moderator
It is becoming more common for NVIDIA board partners to carve GeForce RTX 20-series and GTX 16-series SKUs out of ASICs they weren't originally based on, but GALAX has taken things a step further. The company just launched a GeForce GTX 1650 (GDDR6) graphics card based on the "TU106" silicon (ASIC code: TU106-125-A1). GALAX carved a GTX 1650 out of this chip by disabling all of its RT cores, all of its Tensor cores, and a whopping 61% of its CUDA cores, along with proportionate reductions in TMU and ROP counts. The memory bus width has been halved from 256-bit to 128-bit.
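The cut-down described above can be sanity-checked with a quick script. Note the full TU106 and GTX 1650 unit counts below are taken from public spec sheets, not stated in the article itself:

```python
# Rough check of the "61% of CUDA cores disabled" figure.
# Unit counts are assumed from public spec sheets (full TU106 vs. GTX 1650).
TU106_FULL = {"cuda": 2304, "tmus": 144, "rops": 64, "bus_bits": 256}
GTX_1650 = {"cuda": 896, "tmus": 56, "rops": 32, "bus_bits": 128}

for unit in TU106_FULL:
    cut = 1 - GTX_1650[unit] / TU106_FULL[unit]
    print(f"{unit}: {cut:.0%} disabled")
```

CUDA cores and TMUs both come out to roughly 61% disabled (matching the "proportionate reductions" in the article), while ROPs and the memory bus are each cut by exactly half.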

The card, however, is only listed by GALAX's Chinese regional arm. Its marketing name is "GALAX GeForce GTX 1650 Ultra," with "Ultra" being a GALAX brand extension, not an NVIDIA SKU (i.e. the GPU itself isn't called "GTX 1650 Ultra"). The clock speeds are identical to those of the original, TU117-based GTX 1650: 1410 MHz base, 1590 MHz GPU Boost, and 12 Gbps (GDDR6-effective) memory.



 
They might have gotten it for cheap and disabled all of the extra cores, since Nvidia thought that graphics chip was garbage and sold it off for cheap.
 
NVIDIA themselves did this for early 16 series qualification samples. Both the 1660 and 1660 Ti qualification boards are physically a 2060 with a flipped subsystem ID and a tiny encrypted EFI block that locks them from being reflashed to a 2060.
 
1650 and ultra in the same sentence.

Next generation must be close if they're clutching at this many straws.
 
Galax wanted more chip volume. NVidia said, "well, we've got this pile of rejects over here... make us an offer."
 
Next generation must be close if they're clutching at this many straws.

This is a GALAX move, not an NVIDIA move. Perhaps GALAX are clutching at straws, but I'd wager NVIDIA are yet to break a sweat.
 
1650 Ultra or 1660 Ultra?
 
They might have gotten it for cheap and disabled all of the extra cores, since Nvidia thought that graphics chip was garbage and sold it off for cheap.
It's called "binning". NVidia had a lot of dies that had defects in some areas, but were perfectly functional otherwise.
 
Calling it "Ultra" implies a different NVIDIA SKU despite it supposedly being GALAX's own brand name, and I'm sure they're doing it on purpose. This is dumb and misleading when there's already an insane number of 1650/1660 SKUs.
 
But...but... what if a few of those rays peek through the encryption? Free RT!
 