
Why don't people use processing to generate our heat

Speaking strictly from the heat generation standpoint, aren't computers very inefficient? Correct me if I'm wrong, but isn't heat the result of wasted electricity? For example, a 90% efficient power supply draws 100w from the wall. 90w goes to the components, and the other 10w is lost as heat due to inefficiencies. I imagine components work in much the same way. As the current flows, most of it is actually used to power the components... some of it is lost as heat, because we don't have 100% efficient systems.

Therefore, using computers to generate heat is a bad idea. It's inefficient. Using a heater designed specifically to throw off heat is more efficient than loading a computer to throw off heat. Heaters are cheap compared to computers, and the energy they use is actually being used to heat the room. That said, if you have a computer generating a lot of heat anyway, finding a way to use the heat output wouldn't be such a horrible idea. I used to have heaters like this in one apartment I lived in. It was basically a big radiator that ran along the length of the wall. The fins were electrically heated and dumped heat into the room. Perhaps it would be possible to use some heatpipes to connect something like that to your heatsink base, using the heater as a heatsink. I imagine such a project would be costly and inconvenient, however, even if it did work.
 
The OP's idea has more merit in a business environment. If I could capture the waste heat from my server room and blow it around the office, that would put a noticeable dent in our heating bill every winter. Efficiency isn't an issue here because those servers are running 24/7/365, and their heat is a useful by-product (during the cold months, anyway). As it stands now, we're using air conditioners to dump that heat outside.
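If anyone wants to put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. The server room draw and the electricity price are just assumptions picked for illustration, not figures from our building:

Code:
# Rough sketch of the "server room as office heater" idea.
# All numbers below are hypothetical placeholders, not measurements.

server_draw_kw = 10.0              # assumed continuous draw of the server room
hours_per_heating_month = 30 * 24  # hours in a month of round-the-clock operation
electricity_price_per_kwh = 0.12   # assumed $/kWh

# Essentially every watt the servers draw ends up as heat in the room,
# so the recoverable heat per month is simply the energy drawn that month.
heat_kwh_per_month = server_draw_kw * hours_per_heating_month

# If that heat displaces electric resistance heating elsewhere in the
# building, the offset is worth roughly the same energy at the same rate.
offset_value = heat_kwh_per_month * electricity_price_per_kwh

print(f"Recoverable heat: {heat_kwh_per_month:.0f} kWh/month")
print(f"Potential heating offset: ${offset_value:.2f}/month")

With those made-up numbers that's around $860 a month of heating you're currently paying the A/C to throw away.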
 
The other 90w of power (the part that makes it past the PSU to the components) is also lost as heat in your system, because computers are essentially 100% efficient at turning electricity into heat (a small percentage of that technically goes to spinning the fans). Computers just don't draw enough power to heat a room as quickly as a higher-wattage heater.
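Here's a quick sketch of that energy balance, reusing the 100w draw and 90% PSU efficiency from the earlier post as assumed figures (and ignoring the negligible fraction that leaves the case as noise and light):

Code:
# Minimal energy-balance sketch: to a very good approximation, everything
# a PC draws at the wall ends up as heat in the room.
# The wall draw and PSU efficiency are assumed figures for illustration.

wall_draw_w = 100.0        # assumed draw at the wall
psu_efficiency = 0.90      # assumed PSU efficiency

psu_loss_w = wall_draw_w * (1 - psu_efficiency)   # heat dumped by the PSU itself
delivered_w = wall_draw_w * psu_efficiency        # power delivered to the components

# The delivered 90 W does "work" (switching transistors, spinning fans),
# but nearly all of it degrades to heat inside the case as well.
component_heat_w = delivered_w

total_heat_w = psu_loss_w + component_heat_w
print(f"PSU loss:        {psu_loss_w:.0f} W")
print(f"Component heat:  {component_heat_w:.0f} W")
print(f"Total room heat: {total_heat_w:.0f} W (~= {wall_draw_w:.0f} W drawn)")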
 
My two GTX780s heat my room up even if I leave my window open and it's 0°C outside. It's a nice, comfortable temp in my room.
 

I remember when my 6970s used to do that. Both cards ran at just under 90°C (stock reference cooler) when playing BF3.
 
For many years I've considered things like this, in many different capacities. For instance, having a battery cell in the trunk of a vehicle that collects energy generated by the vehicle's spinning tires, and in particular the friction when the brakes are applied. There are so many sources of wasted energy similar to that one. I remember in my old home watching my oil furnace's burner exhaust and the dryer's exhaust blowing hot air out into the environment and thinking I wished it could've been vented under my driveway to keep ice from forming.

But at the end of the day, at least here in the US, companies don't want customers, they want CONSUMERS, in other words people who buy an item, consume it, and dispose of it, therefore requiring a new item to replace the first.

Which is bad for the environment, bad for the consumer, and although it's profitable for the company in the short term, in the long run it's bad for the company too.
 
In my present case my PC is 90% efficient at running simulations 24/7 and 100% just right for heating my room.
A heater would be more efficient at heating, but it's shit at Crysis, let alone at running sims.
You shouldn't judge it solely on heat output; with the right use case it's doubly efficient.
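To put that "doubly efficient" point in rough numbers, here's a small sketch. The PC's draw, the run time, and the price per kWh are made-up assumptions, not my actual figures:

Code:
# Sketch of the "doubly efficient" argument: the same kWh buys both
# room heat and simulation time. All figures are assumed for illustration.

kwh_price = 0.12        # assumed $/kWh
pc_draw_kw = 0.4        # assumed PC draw while running simulations
hours = 10.0            # assumed run time

energy_kwh = pc_draw_kw * hours
cost = energy_kwh * kwh_price

# A resistance heater turning the same energy into heat would deliver the
# same warmth for the same cost, but zero simulation output.
print(f"{energy_kwh:.1f} kWh of heat delivered either way, cost ${cost:.2f}")
print(f"PC bonus: {hours:.0f} hours of simulation work for the same electricity")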
 
I agree that an enterprise environment would be better suited for this theory. Judging by how much heat this equipment puts out in both our Main Closet and our Server Room, it seems backwards that instead of utilizing the waste heat, we blow A/C in there 24/7. In theory this would make sense, but with most technology budgets, it is easier said than done. It seems at Datacenter level this is already used in some places: http://www.greenbuildingadvisor.com/blogs/dept/building-science/using-server-farms-heat-buildings
 
Datacenters should be built adjacent to office buildings. In the winter, they could use that electronic heat to offset heating costs.
 