
Why don't people use processing to generate our heat?

Joined
Mar 10, 2010
Messages
11,880 (2.14/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360 EK extreme rad + 360 EK slim, all push; CPU EK, GPU full cover, all EK
Memory Gskill Trident Z 3900cas18 32Gb in four sticks./16Gb/16GB
Video Card(s) Asus tuf RX7900XT /Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung 28" U28E850R 4K FreeSync / Dell
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores laptop Timespy 6506
I have thought about this for a long time, and definitely many doing folding and crunching DO use it to heat their rooms or homes, as I do, so I thought I would ask your opinions on it. To me it's a very efficient use of energy to create heat; much better than just pushing loads of current through thin, tough wire anyway.
My admittedly expensive overclocked PC could quite easily heat a room (definitely a flat, maybe a house) and does. I have to adjust processing load to regulate my room's temperature, well, that and windows (real ones), but all in all I get heat. It still costs, but some good gets done too; who loses?
Now, to my mind, if this were a designed system then a house could be heated easily.
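The physics here is simple: a PC and a resistive heater drawing the same wattage dump the same heat into the room, since essentially all electrical draw ends up as heat; the PC just does useful work on the way. A rough sketch, where the wattage and tariff are illustrative assumptions, not measured figures:

```python
# Back-of-envelope: electrical draw in == heat out, so a loaded PC heats
# a room exactly like a resistive heater of the same wattage.
# pc_watts and tariff below are assumed illustrative values.

def daily_heat_kwh(watts: float, hours: float = 24.0) -> float:
    """Energy drawn (and therefore heat released into the room) in kWh."""
    return watts * hours / 1000.0

def daily_cost(watts: float, price_per_kwh: float, hours: float = 24.0) -> float:
    """Running cost at a flat electricity tariff."""
    return daily_heat_kwh(watts, hours) * price_per_kwh

pc_watts = 680    # loaded, overclocked PC (assumed)
tariff = 0.15     # currency units per kWh (assumed)

print(daily_heat_kwh(pc_watts))                 # 16.32 kWh of heat per day
print(round(daily_cost(pc_watts, tariff), 2))   # 2.45 per day at this tariff
```

The same function with a 680 W heater instead of a 680 W PC gives identical numbers; the only difference is whether any computing got done.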
 
There are projects in Finland etc., IIRC, where DCs are piping water to local neighbourhoods and trialling heating their homes, in exchange for bolting radiators to the side of their house.
 
When my daughter wanted a heater in her room I installed a 2P server; after 2 weeks I took it out again... the noise was stopping her sleeping. It was a very effective heater in her room, though.
 
I get that fully, but you were improvising, to be fair. Noise is the reason I spent so much on this PC; the more rad attached, in any way, the quieter it's possible to run your fans.
Think about how effective something made for the task could be.
Since I use earbuds to sleep I should probably shut up, or spend more myself :).
 
Because a space heater uses something like 1200+ watts, where your average computer is doing well to hit 300 W without straining a large GPU. A heater maybe costs $50; a 1200+ watt computer would cost thousands.

Space heaters respond to ambient temperature; computers generally do not.
 
Most modern PCs produce so little heat they're hardly usable as space heaters. An i7 6700/GTX 1070 rig will barely warm a small room; with AMD it gets a little hotter, though.
Data centers already warm our houses enough.
 
Cost and efficiency is why.
Cost is the why. There is a double efficiency to compute heating, but a 1200 watt computer will cost a bit, I know; in load terms, about 600-700 watts continuously in use is adequate in my experience.
@ford Yeah, I get what you are saying, but like I said, a machine made for such a thing, possibly with chips made to run hotter, and obviously something for the PC to do, like folding or crunching.
@Smanci Most modern PCs spend most of their lives doing nothing, so yeah, well spotted, but think outside the box: get the box folding or something else, and run it at clocks that produce the most heat, not the best efficiency, or make a special system.
 
The 2P I put in my daughter's room was crunching at 100% using two Xeon X5650s, not overclocked, on a Supermicro board. It was pulling something like 300 W at the wall with no add-on GPU.
 
In France we have this Parisian startup called Qarnot Computing, which developed the Q.Rad, a "smart heater / high-end computer" connected to the internet.

The heat production and the electricity bill are free; all you have to do is let this high-end computer do cloud computing tasks for third-party customers (banks, rendering farms, etc.).

tech specs:
  • TDP: 500 W (high-end "soft heat")
  • Computing power: 600 GFLOPS (computing heat source)
  • Noise: 0 dB (totally silent)
  • Carbon footprint: 0 g
  • Heating bill: $0
  • Dimensions: 65 × 60 × 16 cm (H×W×D)
  • Heating power: 500 W
  • Weight: 27 kg
  • Power: 110/230 V AC / 500 W
  • Network: RJ45 Ethernet
  • Computing units: 3 high-end Intel® units embedded
  • Per unit: 8 vcores @ 4 GHz / 16 GB RAM
For now, installation of these heaters is only available to buildings equipped with fibre internet; the network bandwidth requirement must be pretty high. Last year, Qarnot Computing signed with the Ville de Paris and deployed 350 heaters throughout 100 social-housing units. The Qarnot computing grid is harnessing the power of 1,000 CPUs and is still growing. (source)
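Taking the quoted spec sheet at face value, the compute-per-heat economics and the fleet's total heating output fall straight out of the numbers above:

```python
# Sanity-check the Q.Rad figures quoted in the spec list above.
gflops = 600.0   # quoted computing throughput
watts = 500.0    # quoted TDP / heating power

efficiency = gflops / watts   # GFLOPS delivered per watt of heating
print(efficiency)             # 1.2 GFLOPS per watt of "waste" heat

# Deployment scale from the article: 350 heaters across 100 housing units
heaters = 350
fleet_heat_kw = heaters * watts / 1000.0
print(fleet_heat_kw)          # 175.0 kW of distributed heating capacity
```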
 
Most modern PCs spend most of their lives doing nothing, so yeah, well spotted, but think outside the box: get the box folding or something else, and run it at clocks that produce the most heat, not the best efficiency, or make a special system.

I don't want to kill my hardware prematurely and suffer from the noise, as it still wouldn't be much of a heater :I
I have been thinking about this idea too, but it just has too many drawbacks to be a good one.
 
My two GTX 980s, two 2600Ks and an i3-3220T provide most of the heat for my family room.
I don't want to kill my hardware prematurely
DC does not overtax hardware. Maybe some of those really hot 7970s.
 
If you're gaming you might as well, or at least turn your heat down when you game.
 
Folding and crunching are good causes, but my home is heated by gas, which is cheaper than using electricity to heat it.
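This is the crux of the economics: resistive (or compute) heating is ~100% efficient at the wall, but per kWh electricity typically costs several times what gas does, so even after boiler losses gas heat is far cheaper. A sketch with assumed illustrative prices (real tariffs vary widely by region and year):

```python
# Cost per kWh of *delivered heat*, electricity vs gas.
# All three numbers below are illustrative assumptions.
elec_price = 0.15   # per kWh of electricity (assumed)
gas_price = 0.04    # per kWh of gas (assumed)
boiler_eff = 0.90   # condensing boiler efficiency (assumed)

elec_heat_cost = elec_price / 1.0        # resistive/compute heat: ~100% efficient
gas_heat_cost = gas_price / boiler_eff   # boiler losses make heat dearer than raw gas

ratio = elec_heat_cost / gas_heat_cost
print(round(ratio, 2))   # electricity ~3.38x the cost per kWh of heat
```

Compute heating narrows the gap only if the work being done has value (folding credit, or being paid for cycles as in the Qarnot model).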
 
It's been thought of: mine bitcoin as subsidized heating.

The general excuse I heard from startups in this field as to why it was impractical is that the Chinese ASIC ICs, when heated, tended to release toxins, and if used in a large-scale heating environment would eventually make you sick or kill you.

No idea if this applies to less strenuous computing activities, though.
 
From my thinking, ASICs would be too efficient; ideally you want as many transistors as possible in as small an area as possible, i.e. GPUs or modern high-core-count CPUs, in order to have a small, manageable area of intense heat production.
I think it's a bit expensive on the smaller scale and less feasibly efficient on a large scale, but sound, bar costs.
As for electrical equipment releasing toxins while in use: COSHH and CE safety marks, as well as FCC standards being applied, should negate such issues, no?
 
Is this another elaborate setup for an AMD thermal punchline?
 
With the system I've got listed?? I've built it up over years, so no. And all modern 8+ core CPUs and modern GPUs run hot; it's the law.
 
From my thinking, ASICs would be too efficient; ideally you want as many transistors as possible in as small an area as possible, i.e. GPUs or modern high-core-count CPUs, in order to have a small, manageable area of intense heat production.

Bitcoin ASICs are a highly competitive field, and are just as tightly transistor-packed as most GPUs and many CPUs. I think they have them down to 22 nm now... they were at 28 nm when I left.

As for electrical equipment releasing toxins while in use: COSHH and CE safety marks, as well as FCC standards being applied, should negate such issues, no?

Not if you're running it from a high-wattage, high-usage compute standpoint, no. The FCC won't even certify compute systems that won't run on a 15 A, 120 V line. I know this because the bitcoin firm Spondoolies-Tech ran into this.
 
I'm sure European law would not allow toxic emissions by electronic devices; I've worked to move devices into COSHH compliance. I thought FCC standards might be able to cope with that issue?? Outside my knowledge, that I'll admit.
Plus, I think it could be a matter of tuning the device and design to suit the application.

A few are saying it can't be done, so I'll give you an example.

I've been folding for years and will do some more, so I have the load; that's beyond question.

My old 2-bed apartment was heated to a reasonable 21-degree ambient temperature in bleak winters by it.

It used 1000-ish watts max, but close to 800 most of the time, and cost 50 quid on electric; not too much with the gas taken off, but more, yes.

My PC now has twice the rad and hotter GPUs, but uses 680 watts typically, flat out with a mild OC, and wow, it can whack out some heat.

In my present home the heater dial is a battleground and my room is a freezer, so I adapted and overcame, and my room is never cold.
Also, I do have a dual-GPU rig that's not bad for gaming, and I was folding anyway.
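Those figures roughly hang together. 800 W continuous over a 30-day month is 576 kWh, which at an assumed tariff of ~£0.10/kWh (a guess at a period UK rate, not a figure from the post) lands in the same ballpark as the 50 quid quoted:

```python
# Back-of-envelope check on the monthly electric cost claimed above.
watts = 800     # typical sustained draw from the post
days = 30
tariff = 0.10   # GBP per kWh; assumed, not from the post

kwh = watts * 24 * days / 1000.0
cost = kwh * tariff
print(kwh)           # 576.0 kWh for the month
print(round(cost))   # ~58 GBP, close to the quoted 50 quid
```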
 
Cost and efficiency is why.


This ^^.

Also, the UK ain't really that cold, and no worthwhile PC is going to keep our house warm when it's some 0-25 °C.

Be much more efficient to just wrap up :).
 
You don't live in my house :)
In France we have this Parisian startup called Qarnot Computing, which developed the Q.Rad, a "smart heater / high-end computer" connected to the internet. [...]
This is more my thinking.
 
I'm sure European law would not allow toxic emissions by electronic devices; I've worked to move devices into COSHH compliance. I thought FCC standards might be able to cope with that issue?? Outside my knowledge, that I'll admit.

I think it's more an issue that the Chinese bitcoin ASICs are shit-quality, lead-laden monstrosities that don't bother with any certs.

Like I said, probably doesn't apply here. :)
 
Back in the days when I used to have two 6970s in CrossFire that ran up to 80-85 °C when gaming...
 
You don't live in my house :)

This is more my thinking.

Then maybe you should change the title? As I understand it, you mean more than just you and your house.
 