Friday, February 7th 2020

Palit Releases GTX 1650 KalmX - a Passively Cooled, 0dB GPU

Palit has today released the latest addition to its KalmX series of passively cooled graphics cards: the GTX 1650 KalmX. This is an ITX-sized, 178 mm long card designed for silent, passively cooled builds where noise is the primary concern. With a heatsink consisting of two heat pipes and a large fin array, the cooling solution should be capable of dissipating the 75 W TDP of the GTX 1650 GPU. The heatsink's cold plate covers both the GPU and the VRMs to keep the card operating safely. Being based on the reference design, the card runs default speeds of 1485 MHz base and 1665 MHz boost. For I/O, Palit includes three ports: one HDMI 2.0b and two DisplayPort 1.4a. All the power needed is provided by the PCIe slot, so there are no external power connectors.
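As a quick sanity check of the no-connector claim, here is a minimal sketch (in Python) that compares the card's TDP against the 75 W a PCIe x16 slot can deliver; the figures are the ones quoted above:

```python
# Minimal sanity check: can the GTX 1650 KalmX run on slot power alone?
# Figures from the article; 75 W is the PCIe x16 slot power limit.
PCIE_X16_SLOT_LIMIT_W = 75

card = {
    "name": "Palit GTX 1650 KalmX",
    "tdp_w": 75,
    "length_mm": 178,
    "base_mhz": 1485,
    "boost_mhz": 1665,
}

needs_connector = card["tdp_w"] > PCIE_X16_SLOT_LIMIT_W
print(f"{card['name']}: TDP {card['tdp_w']} W, "
      f"external power connector needed: {needs_connector}")
```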

60 Comments on Palit Releases GTX 1650 KalmX - a Passively Cooled, 0dB GPU

#26
lexluthermiester
TheDeeGee0 dB, so that means zero Coil Whine?
Palit is one of the brands that cares about acoustics, and as a result very few of their cards experience coil whine.
Posted on Reply
#27
gamefoo21
AssimilatorNice to see no DVI on this card.

Yes, if it's designed properly. And the GTX 1650 has sub-75 W power draw versus the RX 5600 XT's up to 225 W. That's a third of the heat to dissipate.

Of course, the point of that GN video isn't passive cooling; it's the continued fallout from AMD's half-a**ed 5600 XT launch. How difficult is it to make fans spin at a certain speed, for crying out loud? And yet AMD and its partners have managed to break that simple functionality. That's a stunning display of incompetence, even for RTG.
I trust Steve's reporting on this about as far as he can walk on water.

He slurped the big MSI cawk and went deep supporting their shit cards, and it turns out that MSI actually cheaped out on the PCB, which is why the new clocks won't work.

If you bought into his pitch, then he's just reaffirming what he was raging about. It also just so happens that he's shitting on one of MSI's main competitors. That's just a coincidence though... *cough*
Posted on Reply
#28
cucker tarlson
gamefoo21I trust Steve's reporting on this about as far as he can walk on water.

He slurped the big MSI cawk and went deep supporting their shit cards, and it turns out that MSI actually cheaped out on the PCB, which is why the new clocks won't work.

If you bought into his pitch, then he's just reaffirming what he was raging about. It also just so happens that he's shitting on one of MSI's main competitors. That's just a coincidence though... *cough*
did 12gbps work correctly?
Posted on Reply
#29
gamefoo21
cucker tarlsondid 12gbps work correctly?
It did, but that didn't stop MSI from initially committing to support the update, then seeing Asus go 'ohhh, here's some boards with the X swapped to Z and a bigger price tag.'

Then MSI slammed on the brakes, backtracked, used super lame excuses, adopted the Asus strategy, and threw AMD under the bus to cover up their own shit.

Isn't Steve's video a timely response to PowerColor going 'oh, MSI cut corners on the 5600 XT PCB, so that's why it likely won't work long term,' or 'look at these 14 Gbps memory chips we put on our cards'?

As if that wasn't a direct shot at Steve's 'source' quote that big bad AMD forced them to use the cheapest RAM.

I also dislike his clickbait titles, which he's been doing a lot lately.

Note:

I watch Steve for entertainment, but I don't trust him as a sole source; he can certainly keep on trying, though. :)
Chloe PriceIsn't the first Twin Turbo like an S1 with the Turbo Module from the factory? I have the 1st-gen Twin Turbo on my GT 1030 and it's a "little" overkill... :rolleyes:
I ended up selling my S1 ages ago. My turbo module is the two fans wired to go to a 3-pin header.

I actually have it mounted on my Thermalright HR-03 that was cooling a 380X.

The 1030 is a gutless wonder; I have a half-height passive Gigabyte one in my parents' HP thin box. I think it only uses 4 PCIe lanes, and according to the sensors it maxes out at 30 or 40 W... The S1 could keep that chilly, fully passive, with ease.

I have a full-pipe RX 560 that is draw-limited to 75 W. I'd love to mount an S2 on it and let it live passive.
Posted on Reply
#30
Assimilator
gamefoo21I trust Steve's reporting on this about as far as he can walk on water.

He slurped the big MSI cawk and went deep supporting their shit cards, and it turns out that MSI actually cheaped out on the PCB, which is why the new clocks won't work.

If you bought into his pitch, then he's just reaffirming what he was raging about. It also just so happens that he's shitting on one of MSI's main competitors. That's just a coincidence though... *cough*
Lemme get this straight:

AMD told partners to spec their 5600 XT cards for 12 Gbps memory.
MSI designs cards that support 12 Gbps memory. Conforms to the spec.
PowerColor designs cards that support 14 Gbps memory. Unnecessary, but conforms to the spec.
Before the launch of said card, AMD releases a BIOS update that runs the memory at higher speeds than originally specified, thus changing the spec they originally provided.

Yet somehow, MSI is at fault because they didn't design their cards to cater for AMD's stupidity.
Yet somehow, MSI is evil because they claimed that not all RX 5600 XT GPUs can handle 14Gbps memory, but didn't mention the number of PCB layers.
Yet somehow, Steve is part of an MSI CONSPIRACY because he accepted what MSI told him in lieu of other information.
Yet somehow, Steve is no longer reliable because he chose to tear down a card from MSI's "competitor" who "called them out".
Yet somehow, Powercolor failing to validate that the BIOS update they released ACTUALLY WORKS WITHOUT COOKING THEIR OWN CARD, makes them more trustworthy than MSI.

Your conspiracy theory idiocy is so incredibly, utterly, brain-hurtingly stupid that I'm amazed you're capable of breathing.
Posted on Reply
#31
natr0n
Chloe Price-Buys a passive card
-Macgyvers a fan

But why?! :D
When I see a giant heatsink I feel compelled to strap a fan to it. :D
Posted on Reply
#32
cucker tarlson
gamefoo21It did, but that didn't stop MSI from initially committing to support the update, then seeing Asus go 'ohhh, here's some boards with the X swapped to Z and a bigger price tag.'

Then MSI slammed on the brakes, backtracked, used super lame excuses, adopted the Asus strategy, and threw AMD under the bus to cover up their own shit.

Isn't Steve's video a timely response to PowerColor going 'oh, MSI cut corners on the 5600 XT PCB, so that's why it likely won't work long term,' or 'look at these 14 Gbps memory chips we put on our cards'?

As if that wasn't a direct shot at Steve's 'source' quote that big bad AMD forced them to use the cheapest RAM.

I also dislike his clickbait titles, which he's been doing a lot lately.

Note:

I watch Steve for entertainment, but I don't trust him as a sole source; he can certainly keep on trying, though. :)

I ended up selling my S1 ages ago. My turbo module is the two fans wired to go to a 3-pin header.

I actually have it mounted on my Thermalright HR-03 that was cooling a 380X.

The 1030 is a gutless wonder; I have a half-height passive Gigabyte one in my parents' HP thin box. I think it only uses 4 PCIe lanes, and according to the sensors it maxes out at 30 or 40 W... The S1 could keep that chilly, fully passive, with ease.

I have a full-pipe RX 560 that is draw-limited to 75 W. I'd love to mount an S2 on it and let it live passive.
Isn't it far-fetched to discredit GamersNexus for what was, in the first place, AMD's lack of professionalism?
They fulfilled the specs, didn't they?
It seems a plausible theory that AMD doesn't get as much love from MSI as Nvidia does, but I don't think they're at fault for not supporting 14 Gbps through a last-minute update, provided 12 Gbps worked fine.

I don't know why GamersNexus is the object of so many conspiracy theories among TPU members.
Posted on Reply
#33
Keullo-e
S.T.A.R.S.
AssimilatorNice to see no DVI on this card.
That's mean for us DVI users... :(

My second monitor is an HDMI one, but I need to use it with a DVI-to-HDMI cable since the Oculus Rift takes the HDMI connector.
Posted on Reply
#34
gamefoo21
cucker tarlsonIsn't it far-fetched to discredit GamersNexus for what was, in the first place, AMD's lack of professionalism?
They fulfilled the specs, didn't they?
It seems a plausible theory that AMD doesn't get as much love from MSI as Nvidia does, but I don't think they're at fault for not supporting 14 Gbps through a last-minute update, provided 12 Gbps worked fine.

I don't know why GamersNexus is the object of so many conspiracy theories among TPU members.
I am simply of the opinion that Steve is human. He makes money, has bills, has employees, and so he does what he does. All humans have biases; it's part of the human condition.

AMD had to respond to Nvidia. They couldn't drop the price, and they had validated things on their end for the faster toys. So AMD gave people a 5650 XT without charging for it. What were their other options?

They obviously can't tank the price of the cards, or they would.

They know for a fact that if 5600 XTs started dying left, right, and center, they'd be blamed for it hard. They tested the design guidelines and deemed it safe.

Unprofessional, eh... For the followers of PC Jesus, maybe... Should they not have said so much at CES? Probably.

Let's not forget that the 2060 that now shows up in those price comparison guides is severely limited. It's built on failed 2070 dies with bits lasered off. So it eats more power, has a crappier cooler, and stock hasn't exactly been plentiful.

MSI went 'oh yeah, our not-shit versions of the 5600 XT will be getting BIOS upgrades.' Asus said 'nope, we are releasing a faster and more expensive version.' MSI went 'oh, more money!' Then they went on a rampage. Steve suddenly got onside, parroting what MSI told him and how he can't handle his image being tarnished.

MSI trying to cover up their about-face is unprofessional.

Steve suddenly forgetting about MSI putting out several failed cards and coolers, and parroting MSI's superior and thorough verification process... Then the quotes from 'sources'...

Steve suddenly deciding he cares about workstation performance when he comes into possession of a pair of V100 cards. Then he suddenly cares about it for the gimped 2060, how it's banging if you care about workstation performance. All while actively pushing his 'worst hardware of 2019' swag... Oh, and the GPU in it shits all over his Team Green QuadRTX 5K, but suspiciously there are no Radeons in his workstation testing, because apparently only CUDA-enabled GPUs can do that stuff... His bias is on full display; whether reinforced with *cough* gifts *cough* or not, it should remind all of us not to sole-source info.

Remember:

Steve takes/gets gifts/samples from, and negotiates directly with, the manufacturers/suppliers/sellers/vendors for that marketing money/perks/samples/etc., doesn't disclose a lot of it, and when pushed has a hissy fit or claims it's to protect his sources. He still only pushes Thermal Grizzly.

I watch his videos, but I don't trust him like he's a benevolent god the way so many seem to.

I have a firmly middle view of him. If I hated him and wanted to tear him down, I'd say he's as trustworthy as Fox & Friends... I trust him more than Linus, so there's that.

Edit: the 2060 KO is out of stock on Newegg and Amazon US.
Posted on Reply
#35
CheapMeat
YESSS! A great addition to have out there. Going to try to buy one myself.

I personally have zero issues with airflow in any of my builds, because I use Rosewill 4U cases with straight, wind-tunnel-like airflow powered by six 120 mm fans in front of the motherboard.

I just love the industrial design of passive stuff.
Posted on Reply
#36
Hyderz
Kalm is the first town after you exit Midgar in FF7. :)
Posted on Reply
#37
Vayra86
AssimilatorLemme get this straight:

AMD told partners to spec their 5600 XT cards for 12 Gbps memory.
MSI designs cards that support 12 Gbps memory. Conforms to the spec.
PowerColor designs cards that support 14 Gbps memory. Unnecessary, but conforms to the spec.
Before the launch of said card, AMD releases a BIOS update that runs the memory at higher speeds than originally specified, thus changing the spec they originally provided.

Yet somehow, MSI is at fault because they didn't design their cards to cater for AMD's stupidity.
Yet somehow, MSI is evil because they claimed that not all RX 5600 XT GPUs can handle 14Gbps memory, but didn't mention the number of PCB layers.
Yet somehow, Steve is part of an MSI CONSPIRACY because he accepted what MSI told him in lieu of other information.
Yet somehow, Steve is no longer reliable because he chose to tear down a card from MSI's "competitor" who "called them out".
Yet somehow, Powercolor failing to validate that the BIOS update they released ACTUALLY WORKS WITHOUT COOKING THEIR OWN CARD, makes them more trustworthy than MSI.

Your conspiracy theory idiocy is so incredibly, utterly, brain-hurtingly stupid that I'm amazed you're capable of breathing.
Thanks for saving me a wall of text.
The lengths people go to in order to deflect blame from AMD's shit GPU practices never cease to amaze. Holy shit.
Posted on Reply
#39
Vayra86
gamefoo21I am simply of the opinion that Steve is human. He makes money, has bills, has employees, and so he does what he does. All humans have biases; it's part of the human condition.

AMD had to respond to Nvidia. They couldn't drop the price, and they had validated things on their end for the faster toys. So AMD gave people a 5650 XT without charging for it. What were their other options?

They obviously can't tank the price of the cards, or they would.

They know for a fact that if 5600 XTs started dying left, right, and center, they'd be blamed for it hard. They tested the design guidelines and deemed it safe.

Unprofessional, eh... For the followers of PC Jesus, maybe... Should they not have said so much at CES? Probably.

Let's not forget that the 2060 that now shows up in those price comparison guides is severely limited. It's built on failed 2070 dies with bits lasered off. So it eats more power, has a crappier cooler, and stock hasn't exactly been plentiful.

MSI went 'oh yeah, our not-shit versions of the 5600 XT will be getting BIOS upgrades.' Asus said 'nope, we are releasing a faster and more expensive version.' MSI went 'oh, more money!' Then they went on a rampage. Steve suddenly got onside, parroting what MSI told him and how he can't handle his image being tarnished.

MSI trying to cover up their about-face is unprofessional.

Steve suddenly forgetting about MSI putting out several failed cards and coolers, and parroting MSI's superior and thorough verification process... Then the quotes from 'sources'...

Steve suddenly deciding he cares about workstation performance when he comes into possession of a pair of V100 cards. Then he suddenly cares about it for the gimped 2060, how it's banging if you care about workstation performance. All while actively pushing his 'worst hardware of 2019' swag... Oh, and the GPU in it shits all over his Team Green QuadRTX 5K, but suspiciously there are no Radeons in his workstation testing, because apparently only CUDA-enabled GPUs can do that stuff... His bias is on full display; whether reinforced with *cough* gifts *cough* or not, it should remind all of us not to sole-source info.

Remember:

Steve takes/gets gifts/samples from, and negotiates directly with, the manufacturers/suppliers/sellers/vendors for that marketing money/perks/samples/etc., doesn't disclose a lot of it, and when pushed has a hissy fit or claims it's to protect his sources. He still only pushes Thermal Grizzly.

I watch his videos, but I don't trust him like he's a benevolent god the way so many seem to.

I have a firmly middle view of him. If I hated him and wanted to tear him down, I'd say he's as trustworthy as Fox & Friends... I trust him more than Linus, so there's that.

Edit: the 2060 KO is out of stock on Newegg and Amazon US.
What AMD did, and nothing else, is put its AIB partners and ALL of its prospective and existing customers in great uncertainty, often with an even less functional product and extra expenses, all just to 'one-up' Nvidia, which had a more competitive product out. Note, AMD could also have 'jebaited' them with another price cut, and everyone would have won. But no, this was deemed the best way forward: creating instability in a product already out in the wild and on sale.

This is straight up an ass move. Even in the event that you have your working 14 Gbps version, how confident are you really in that product?!

For comparison, Nvidia released its 9 Gbps and 11 Gbps Pascal card revisions the way it should be done: as normal, transparent releases.
Posted on Reply
#40
cucker tarlson
gamefoo21So AMD gave people a 5650 XT without charging for it.
No, absolutely not!
Jesus Christ, it's like talking to a brick wall.
They hit the panic button and almost botched the entire launch, since they went out of spec a minute before the release.

Am I crazy for thinking GPUs should be complete and ready products, and any adjustments should be done via price cuts? Shipping the cards with the standard BIOS at launch and releasing a vBIOS update to hit the marketed performance is just not the way it should ever be done.
Vayra86What AMD did, and nothing else, is put its AIB partners and ALL of its prospective and existing customers in great uncertainty, often with an even less functional product and extra expenses, all just to 'one-up' Nvidia, which had a more competitive product out. Note, AMD could also have 'jebaited' them with another price cut, and everyone would have won. But no, this was deemed the best way forward: creating instability in a product already out in the wild and on sale.

This is straight up an ass move. Even in the event that you have your working 14 Gbps version, how confident are you really in that product?!

For comparison, Nvidia released its 9 Gbps and 11 Gbps Pascal card revisions the way it should be done: as normal, transparent releases.
lol, 'jebaited' is a theme with AMD these days. :roll: I don't think this is how business should be conducted. Seems incompetent and desperate.
Posted on Reply
#41
MrGRiMv25
natr0nWhen I see a giant heatsink I feel compelled to strap a fan to it. :D
You can use "tweezers" to attach it as well. :oops:
Posted on Reply
#42
notb
londisteIt actually does not matter that much. With fins as sparse as these, any airflow is good. Placing this in a completely closed environment will obviously suck, but as long as there is any airflow at all (be it from a 300 RPM case fan or two), it will manage. The jump from completely passive to a little bit of airflow is usually underestimated.
It really depends. You have to consider how most cheap desktops look: most entry-level ATX towers ($500-800) come with a SINGLE exhaust fan next to the I/O.
Even if you get something more expensive or add an intake fan, the majority of the airflow is still going to pass above the GPU. It's not magic; air looks for the easiest way out.

Of course, that doesn't mean the card won't work. But it will run much hotter than it could have, and that may affect performance. The rough numbers below illustrate the point.
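A first-order sketch of that argument uses T_gpu = T_ambient + P × θ, where θ is the cooler's effective thermal resistance; the θ values here are illustrative guesses, not measurements of this card:

```python
# First-order GPU temperature estimate: T_gpu = T_ambient + P * theta.
# Theta (K/W) drops sharply once there is any airflow over the heatsink;
# the values below are illustrative guesses, not measured data.
def gpu_temp_c(power_w, theta_k_per_w, ambient_c=30.0):
    return ambient_c + power_w * theta_k_per_w

POWER_W = 75  # GTX 1650 TDP
for label, theta in [("no airflow", 1.0),
                     ("slow case fans", 0.6),
                     ("decent airflow", 0.4)]:
    print(f"{label:>15}: ~{gpu_temp_c(POWER_W, theta):.0f} °C")
```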

So, what I was trying to say: for most people, with typical case setups, a quiet actively cooled card will be a better choice (resulting in lower total noise).
Cheaper as well. Heatsinks are expensive; the Palit 1050 Ti KalmX was among the most expensive 1050 Tis available.
This card is a good choice specifically for PCs whose case airflow wouldn't work well with normal cards.
For example: if you had a case similar to the Ghost S1, but with a solid side wall.
By the way, I am pretty sure the KalmX will not fit into the Ghost because the card is too wide ;)
Actually, it's smaller than it looks (that's because we're used to dual-fan cards with different proportions).
Palit says 138 mm.
MSI says 140 mm for the GTX 1080 GAMING Z 8G.
The Ghost S1 can take GPUs up to 142 mm tall. :)

Just a quick check for those unconvinced: Paint agrees, more or less (perspective makes the Palit look a few mm taller).
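The same fit check in code form (a trivial sketch; the heights are the ones quoted above, and 142 mm is the Ghost S1 clearance figure):

```python
# Will these cards fit a Ghost S1? Heights in mm, as quoted in the post.
GHOST_S1_MAX_GPU_HEIGHT_MM = 142

cards_mm = {
    "Palit GTX 1650 KalmX": 138,
    "MSI GTX 1080 GAMING Z 8G": 140,
}

for name, height_mm in cards_mm.items():
    verdict = "fits" if height_mm <= GHOST_S1_MAX_GPU_HEIGHT_MM else "too tall"
    print(f"{name}: {height_mm} mm -> {verdict}")
```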

Posted on Reply
#43
ARF
75 watts, according to the reference card's specification. This heatsink might not actually be enough; the likelihood that this would throttle in real-world usage is very high.
Posted on Reply
#44
lexluthermiester
ARF75 watts, according to the reference card's specification. This heatsink might not actually be enough; the likelihood that this would throttle in real-world usage is very high.
That is very unlikely.
Posted on Reply
#45
ARF
Have you seen passively cooled 65 W processors like the Ryzen 7 3700X or Ryzen 7 2700?
Have you ever seen passively cooled 15 W mobile processors in laptops?

If the answer is no (I haven't seen any either), then passively cooled video cards are also highly unlikely to function without issues.
Posted on Reply
#46
notb
ARFHave you seen passively cooled 65 W processors like the Ryzen 7 3700X or Ryzen 7 2700?
Passive cooling just means the airflow isn't generated by a fan mounted to the heatsink itself; the cooling is provided by the case setup.
Most servers and many OEM workstations work that way, with CPU/GPU TDPs going beyond 200 W.
If the answer is no (I haven't seen any either), then passively cooled video cards are also highly unlikely to function without issues.
Passively cooled 75 W cards exist. End of story. Look it up.

No one says this card will work with no forced airflow at all (i.e., in open air, like smartphones do).
But, frankly, the Palit 1050 Ti KalmX did pretty well.
Posted on Reply
#47
lexluthermiester
ARFHave you seen passively cooled 65 W processors like the Ryzen 7 3700X or Ryzen 7 2700?
Have you ever seen passively cooled 15 W mobile processors in laptops?

If the answer is no (I haven't seen any either), then passively cooled video cards are also highly unlikely to function without issues.
I have had one. I mounted it on a GTX 560 Ti. There was enough airflow through the case I had at the time that it never got above 75°C.
www.overclockers.com/arctic-accelero-s3-passive-graphics-card-cooler-review/
Worked perfectly, as @EarthDog can attest.

Cards that come from the factory passively cooled have been tested many times to make sure they meet the necessary specifications. Palit is no stranger to passive cooling; this is not their first go at it.
notbBut, frankly, the Palit 1050 Ti KalmX did pretty well.
Exactly.
Posted on Reply
#48
londiste
londisteBtw, PowerColor also has a basic 5600 XT model with 12 Gbps memory: www.powercolor.com/product?id=1577415751#spe
www.pcgamer.com/powercolor-sheds-light-on-why-memory-speeds-differ-on-the-radeon-rx-5600-xt/
"Our Stock model 5600 XT follows AMD's reference specs but we share [the same] memory modules as Red Devil and Red Dragon. You would ask why we don't run at 14Gbps then—the reason is Red Devil and Red Dragon have a higher PCB layer count, 10 versus 8 found on the stock 5600 XT, which means the memory signal is much cleaner and stable with our premium models. Having a 10 layer PCB is important to the higher clocks on the GDDR6 14Gbps memory," PowerColor said.
ARFHave you seen passively cooled 65 W processors like the Ryzen 7 3700X or Ryzen 7 2700?
The 3700X is an 88 W CPU.
The 2700 can be and has been passively cooled; given a large enough heatsink, it'll be more or less fine. Again, it depends on case airflow.
I personally had an i7-4770 (84 W TDP) under a Scythe Ninja 2 that worked fine with a couple of slow case fans providing the airflow.
Posted on Reply