
Galaxy Intros Single-slot GeForce GTX 750 Ti Razor Graphics Card

btarunr

Editor & Senior Moderator
Galaxy launched its single-slot GeForce GTX 750 Ti Razor graphics card in Europe, under the Galax and phasing-out KFA2 brands. This is the second such single-slot GTX 750 Ti card, after ELSA launched one in certain APAC markets. Galaxy's card uses a typical nickel-plated copper channel heatsink, which draws heat from a copper plate that makes contact with the GPU, memory, and VRM. The card relies entirely on the PCI-Express 3.0 x16 bus for power. Display outputs include one each of D-Sub, dual-link DVI, and HDMI. The card ships with clock speeds of 1020 MHz core, 1080 MHz GPU Boost, and 5.40 GHz (GDDR5-effective) memory. Available now, the card is priced at 139.90€, including taxes.



View at TechPowerUp Main Site
 
The hell is that DVI port? Why not a full DVI port? This card is what a lot of people want, but that DVI port is a bit of a bummer.
 
The hell is that DVI port? Why not a full DVI port? This card is what a lot of people want, but that DVI port is a bit of a bummer.

What's wrong with DVI-D?
 
Just flashing the pansies I see. Awesome card, if there's wide availability or an easy way to get it. Why D-Sub? How about DP + 2x HDMI? Or mini DP + 2x mini HDMI? OK, dual-link DVI is alright, but that D-Sub, come on - 2015 is knocking on the door.

Edit: while writing this, two other people commented on the ports...wow :)
 
What's wrong with DVI-D?

It doesn't work with converter cables. If somebody wants to run a VGA-to-DVI converter with this card, they can't, because the connector is missing the four analog pin holes.

 
Why not write the name of the card as it's labelled, KFA2, not Galaxy? Even if it's the same company, KFA2 is just one of Galaxy's brands ;)
 
Doesn't work with converter cables. In the event somebody wants to run a VGA>DVI converter with this card, they can't because it's missing the 4 pin holes.

You've got a dedicated VGA port, so there's no need for adapters. And if you only have old VGA screens, maybe it's time for an upgrade on your screens too? ;)
 
Doesn't work with converter cables. In the event somebody wants to run a VGA>DVI converter with this card, they can't because it's missing the 4 pin holes.

If they want VGA, there's a port right there...I still don't see what the problem is.

Ninja'd by puma I see...
 
You've got a dedicated VGA port, so there's no need for adapters. And if you only have old VGA screens, maybe it's time for an upgrade on your screens too? ;)

Valid point, but the reality is some people will need that ability. At my workplace we buy low-power basic GPUs for multi-screen setups. For us, DVI-D instead of DVI-I would be a huge deal breaker. Those requirements still exist, regardless of how ancient they are.

If they want VGA, there's a port right there...I still don't see what the problem is.

People will run more than just one VGA monitor. Most basic corporate businesses buy cheap monitors, most of which still only have VGA.

I agree with everyone, VGA should die, but the fact is it hasn't so it is still a requirement until panel makers stop making VGA panels.
 
Valid point, but the reality is some people will need that ability. At my work place we buy low power basic GPUs for multi-screen setups. For us, that DVI-D instead of DVI-I would be a huge deal breaker. Those requirements still exist, regardless of how ancient they are.



People will run more than just 1 VGA monitor. Most basic corporate business buy cheap monitors, most of which still only have VGA.

I see your point...I use two monitors as well, one on DVI and one on VGA (an old monitor) with an adapter, but I can see the problem if there is someone out there who uses two VGA-only monitors. I'm glad my card came with at least one DVI-I so I can use an adapter.
 
Valid point, but the reality is some people will need that ability. At my work place we buy low power basic GPUs for multi-screen setups. For us, that DVI-D instead of DVI-I would be a huge deal breaker. Those requirements still exist, regardless of how ancient they are.



People will run more than just 1 VGA monitor. Most basic corporate business buy cheap monitors, most of which still only have VGA.

I agree with everyone, VGA should die, but the fact is it hasn't so it is still a requirement until panel makers stop making VGA panels.

I know this will still exist, RCoon, but more and more manufacturers of SFF computers now use one DisplayPort and one VGA, with no DVI, which is annoying sometimes. But that's the reality, and maybe in the near future they will start using only DisplayPort and no analog connection...
 
Oh, that thing from Computex this year. Nice to see it finally rolling out. Wonder if that fancy-pants Darbee card is right behind.

As for the ports, dual link DVI-I would've been better for usage versatility. I can see people still using dual CRTs and/or el-cheapo LCDs that only have VGA inputs. But for business use? Isn't a GTX 750 Ti a little over-the-top for most business applications? And the few that aren't generally require specialized hardware.
 
750ti? Why? It's time for 950(ti).

If they made a GTX 950Ti right now it would be the same as the GTX 750Ti. They would both be 28nm Maxwell. Possibly they will do this with the die shrink to 20nm next year.
 
I love it!!!

Though there are two things I would love to see still

1: DP
2: a low profile variant

Bit of a high request but I would think it is possible to do the low profile variant basing it off what other cards I have seen like that. Either way though I love the single slot cards like this because they make me wanna build a tiny computer.
 
I love it!!!

Though there are two things I would love to see still

1: DP
2: a low profile variant

Bit of a high request but I would think it is possible to do the low profile variant basing it off what other cards I have seen like that. Either way though I love the single slot cards like this because they make me wanna build a tiny computer.

maybe that will come with a GTX 960 or 950/Ti ;)
 
at 1080p with a modern cpu this little card will run ALL games above 40 fps on high settings. don't let the hardware snobs talk down to you. this card with that small footprint and power usage is great for 95% of PC gamers out there.
 
galaxy cards for me always get too hot and seem to take way too much power. i had to take two back in the past, so i stick with pny. Nice looking card though
 
Fine idea, but... the cooling system probably sounds like a jumbo jet during takeoff. That's the biggest downer.

Nice addition for a media PC or something along those lines, but the noise of that tiny fan... Thanks but no thanks.
 
galaxy cards for me always get too hot and seem to take way too much power. i had to take two back in the past, so i stick with pny. Nice looking card though

I can't see how it can take way too much power when there is no 6-pin connector; it draws all power through the slot.
 
It's interesting that they're only releasing it now. You can see "1350" printed on the PCB, which means the 50th week of 2013... so a year-old card released one year later.
 
Didn't they turn into "GALAX" ?
 
Valid point, but the reality is some people will need that ability. At my work place we buy low power basic GPUs for multi-screen setups. For us, that DVI-D instead of DVI-I would be a huge deal breaker. Those requirements still exist, regardless of how ancient they are.

The 750 Ti, like most other recent cards, only has one analog output. Since this card already has a dedicated VGA port, it would be impossible to make the DVI port a DVI-I port; it has to be DVI-D. There is physically no second analog output from the GPU to wire to the DVI port.
 
Valid point, but the reality is some people will need that ability. At my work place we buy low power basic GPUs for multi-screen setups. For us, that DVI-D instead of DVI-I would be a huge deal breaker. Those requirements still exist, regardless of how ancient they are.



People will run more than just 1 VGA monitor. Most basic corporate business buy cheap monitors, most of which still only have VGA.

I agree with everyone, VGA should die, but the fact is it hasn't so it is still a requirement until panel makers stop making VGA panels.

There are HDMI-to-VGA converters!
 