If my computer is a fan heater, when will my fan heater be a computer?
December 20, 2006 4:19 PM   Subscribe

How efficient is my computer as a means to heat my apartment?

From what I've heard, energy cannot cease to exist, only be turned into other forms. So the question is: what becomes of the energy that goes into my computer?

I'm guessing it goes to two things: some heat, and some motion in the air caused by the CPU fan and so on. So is my computer essentially a fan heater?
posted by cheerleaders_to_your_funeral to Computers & Internet (37 answers total) 2 users marked this as a favorite
 
About as effective as a 100W light bulb, since that's pretty close to the idle power. At full tilt, a modern machine could push 400-500 watts: 4 or 5 light bulbs.
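
A rough back-of-the-envelope, using those wattages plus an assumed number of hours of use (the hours are made up, purely illustrative):

```python
# Rough daily heat output from a PC, assuming the wattages above.
# Essentially every watt drawn from the wall ends up as heat in the room.
IDLE_WATTS = 100     # assumed idle draw
LOAD_WATTS = 450     # assumed draw at full tilt
HOURS_ON = 8         # assumed hours of use per day (illustrative)

for label, watts in [("idle", IDLE_WATTS), ("full tilt", LOAD_WATTS)]:
    kwh_per_day = watts * HOURS_ON / 1000
    print(f"{label}: {watts} W x {HOURS_ON} h = {kwh_per_day:.1f} kWh of heat per day")
```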
posted by chairface at 4:23 PM on December 20, 2006


Yeah, it's basically a fan heater, and about as efficient, watt for watt, as one of those. The chips do virtually no work in the classical physics sense (a force over a distance) so all of the energy they consume is turned into heat.
posted by kindall at 4:23 PM on December 20, 2006


It's more interesting to consider what energy doesn't get turned into heat. You're sending out some energy on your net connection, more if it's wireless. Flash memory can store a little bit, I guess. I forget the magnetic details of hard drive platters - is there any energy storage there?
posted by TheOnlyCoolTim at 4:26 PM on December 20, 2006


If all your heat sinks and fans are working properly they should be blowing hot air out of your computer (either directly or by displacing it with cool room air). Computer manufacturers do a lot of engineering to keep heat generation to a minimum, but it is still noticeable. When chips and drives get too hot they fail, often permanently. Dell et al. don't want to keep replacing dead motherboards, so they try to reduce the amount of heat generated and funnel what does exist out of the computer. The room with my computer is warmer than the rest of my apartment, especially when it's been on for a few days. As far as being an efficient heat source goes, though, it's not, at all.

You can really heat up a small room with just a few computers; that's why server farms and remote backup sites pay a lot for cooling. But it'll cost a lot more than using a regular heater, as computers are not engineered to produce lots of heat, but rather to produce as little as possible.
posted by Science! at 4:29 PM on December 20, 2006


#Science!: it'll cost a lot more than using a regular heater as computers are not engineered to produce lots of heat

Um, the heat produced by any appliance only depends on the energy it consumes. There is no such thing as "an efficient electric heater", though some may contain fans to blow heat in desired directions.
posted by MonkeySaltedNuts at 4:37 PM on December 20, 2006


MSN: I think he meant that computers include all this costly stuff (y'know displays and precision engineering) not optimized to produce heat. That is, buying a laptop to heat your room will cost a lot more than buying a WalMart fan heater to do the same job.

At least, that's how I read it.
posted by vacapinta at 4:40 PM on December 20, 2006


Best answer: Every electrical device in your apartment is 100% efficient at producing heat, as long as all the energy consumed by the device stays in your apartment (because all that energy eventually turns into heat). Only an insignificant amount of energy leaves your apartment from your computer (e.g., per Tim above: wireless has a small output; data storage may hold some energy, but it will later be released as long as you don't remove the storage device; sound transmitted outside your apartment is an energy loss), so it is a quite efficient heater.
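
For a sense of scale, a rough tally (the non-heat figures below are order-of-magnitude guesses, not measurements):

```python
# Rough power budget for a PC drawing an assumed 200 W.
# The non-heat terms are order-of-magnitude guesses, just to show how tiny they are.
draw_w = 200.0        # assumed total electrical draw
wifi_rf_w = 0.1       # ~100 mW of radio actually leaving the antenna (rough guess)
led_light_w = 0.01    # a few mW of visible light from indicator LEDs (rough guess)
sound_w = 0.05        # acoustic power escaping as sound (rough guess)

heat_w = draw_w - (wifi_rf_w + led_light_w + sound_w)
print(f"heat: {heat_w:.2f} W, i.e. {100 * heat_w / draw_w:.2f}% of the draw")
# heat: 199.84 W, i.e. 99.92% of the draw
```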
posted by ssg at 4:49 PM on December 20, 2006


I would argue that there are ways to heat a room that are more efficient than others. Convection is a much better method of distributing heat than conduction (this depends on the application, but for space heating I argue this is true). So, blowing hot air around is better than, say, a heating coil sitting in the corner.

A computer won't be terribly efficient at heating a room, then, since you don't get a whole lot of airflow (in addition to the fact that chips are designed not to give off excess heat). You will notice it, but you certainly couldn't turn your heat off in the middle of winter and simply rely on your computer for warmth.

A bit of lore used to float around my college when I was still there. It was said, back in the day, that the computer center used to be heated in the winter by the mainframes (back when computers gave off stupid amounts of heat). Nowadays there's a normal heating/cooling system in there.
posted by backseatpilot at 4:54 PM on December 20, 2006


vacapinta hit what I meant. If you were to heat your apartment/house with a conventional heating system it would cost a whole lot less (speaking only of energy cost, not hardware cost) than it would to heat it to the same temperature using nothing but computers.
posted by Science! at 4:59 PM on December 20, 2006


Sort of following on this: has anyone metered their Mac, or found a site that has? My wife is convinced that our electric bill has gone up because of increased computer use. I never took that seriously, but if it's really 400-500 watts while running, maybe I should reconsider.
posted by mzurer at 5:09 PM on December 20, 2006


You will notice it, but you certainly couldn't turn your heat off in the middle of winter and simply rely on your computer for warmth.

I actually did this when my heat went out. Closing the door to my 8x8 Manhattan bedroom and wearing blankets, it worked.
posted by TheOnlyCoolTim at 5:09 PM on December 20, 2006


Very efficient, yes; almost all of the energy ends up heating your dwelling, but it's not very powerful. When you want to heat your dwelling with equipment, I suggest a pair of Balanced Audio Technologies VK150SE monoblock amps. Those Russian 6C33C-B tubes are horribly inefficient and throw off tons of heat. At idle, 500 watts; at full throttle, 1,000 watts, each. The sound, however, is sweet and glorious. I suggest using your laptop while playing these babies so you can sit in the sweet spot. You won't miss the heat thrown off by your desktop unit with 2,000 watts of heat (300 of which go into the music) filling your room. (Yes, I suffer from tube lust; I have some of their gear, but not this gear, and it is truly awesome, yet a flame thrower, not for warm climates.)

[Check the prices of electricity and fuel in your area. Usually electricity is more expensive per equivalent energy unit, even given its 100% heating efficiency versus the 90-percent range for good fossil fuel systems. That, and electric generating stations have efficiencies in the mid-30s to 40 percent, with most of the rest just wasted, exhausted into the environment. Don't heat with electricity if you value the environment.]
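
A rough cost-per-unit-of-heat comparison along those lines (the prices and efficiencies below are placeholder assumptions; plug in your local rates):

```python
# Cost per kWh of heat delivered: electric resistance heating vs. a gas furnace.
# Prices and efficiencies are placeholder assumptions, not real local rates.
ELEC_PRICE_PER_KWH = 0.12     # assumed $/kWh of electricity
GAS_PRICE_PER_THERM = 1.20    # assumed $/therm of natural gas
KWH_PER_THERM = 29.3          # 1 therm = 100,000 BTU, about 29.3 kWh
FURNACE_EFFICIENCY = 0.90     # assumed furnace efficiency

elec_cost = ELEC_PRICE_PER_KWH               # resistive heat: ~1 kWh of heat per kWh at the meter
gas_cost = GAS_PRICE_PER_THERM / (KWH_PER_THERM * FURNACE_EFFICIENCY)
print(f"electric resistance: ${elec_cost:.3f} per kWh of heat")
print(f"gas furnace:         ${gas_cost:.3f} per kWh of heat")
```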
posted by caddis at 5:11 PM on December 20, 2006


oops, forgot cite1, cite2 (pdf).
posted by caddis at 5:14 PM on December 20, 2006


When I was at the University of Washington there was a 24 hour computer lab that had public computers as well as some bigger mainframe type machines. When they moved the bigger computers out, they had to start heating the place because there was no longer all the heat generated from the old hot computers. This is what I recall hearing at the time, does this ring a bell with anyone?
posted by jessamyn at 5:19 PM on December 20, 2006


yes, yes it does, those old mainframes produced tons of heat. The cooling systems were rather sophisticated.
posted by caddis at 5:25 PM on December 20, 2006


mzurer, look at the power supply info on the back; that says how much energy it takes to run...
posted by jare2003 at 5:26 PM on December 20, 2006


mzurer, look at the power supply info on the back; that says how much energy it takes to run...

That label is, at best, an estimation of the peak wattage the supply can handle, and at worst (with a cheap supply), an outright lie.

Power consumption varies depending on what components are in use, so a 500 watt power supply doesn't necessarily draw more than a 250 watt power supply.
posted by cmonkey at 5:32 PM on December 20, 2006


A few of these comments have me pondering this more, maybe I'm thinking about it wrong.

A well engineered computer would use as little energy as possible, and produce as little heat as possible, while a well engineered heater would use as little energy as possible and produce as much heat as possible.

Engineering limitations in computers produce heat: a perfect hard drive would have no friction to generate heat, a perfect CPU would draw only as much energy as needed to perform its work and not give off heat, and a perfect power supply would lose no energy as heat when it converts AC to DC. Engineering limitations in heaters, meanwhile, fail to produce heat: the burner on an electric stove only glows red because visible light is a by-product of its inefficiency, and a perfect heat source would lose no energy as visible light.

It's not just that a computer uses less energy than a heater, therefore produces less heat. I'm saying for any given amount of energy a computer produces as little heat as possible because it has other work to do, while a heater produces as much as it can because that's its only job.

Am I thinking about this wrong?
posted by Science! at 6:01 PM on December 20, 2006


jessamyn writes "This is what I recall hearing at the time, does this ring a bell with anyone?"

Some big iron gives off so much heat that it has to be water cooled, at flow rates of up to 245 liters per minute.
posted by Mitheral at 6:13 PM on December 20, 2006


The heat produced by your computer is going to vary depending on what you're doing with it. Right now, doing nothing but reading this question, I can stick my hand behind my box and feel cool air coming out of it. After a few hours of getting my hero on, however, I can reach back there and feel considerably warmer exhaust.

Basically, your two biggest heat-generating components in your computer are your CPU and your video card, and they put out heat proportional to the load placed on them. You're not going to get any meaningful heat out of a decently-engineered system without constantly running at 100% load, which will drastically shorten the lifespan of your components. Similarly, poorly-engineered systems get a lot warmer than necessary, which quickly wears down the parts.

To put it shortly, chips don't like heat.
posted by Spike at 6:28 PM on December 20, 2006


In addition to a computer, a halogen torchiere will heat a smallish low-ceilinged room well enough. I found this very useful in an "efficiency" apartment where heat (from an old steam radiator) was included in the rent (and seldom actually working). After a few hours I'd have to crack a window or take something off; my poor cat shed like crazy all winter and spent a lot of time in the chilly bathroom. I'm sure this wasn't an efficient use of energy, but I think it was better than burning plain light bulbs and running a space heater. I'm pretty sure the heat from the computer (and monitor) contributed to this because when I'd turn the computer off I'd start feeling colder -- and this was a puny 386 with a puny power supply.

It helps if your ceiling is painted white to reflect the light and heat from the torchiere back down into the room; I'd thought about increasing this by putting something more reflective on the ceiling above it, maybe aluminum foil, but never got around to bothering. If I were doing that now I'd want something more direct, like a work light, but not if I had to pay for electricity. (And these days my computers are if anything overpowered -- and warmer!)
posted by davy at 6:31 PM on December 20, 2006


#backseatpilot: there are ways to heat a room that are more efficient than others... blowing hot air around is better than, say, a heating coil sitting in the corner.

No. Both are equally efficient in heating a sealed box room. Blowing air might make far parts of a cold room heat up faster but you will initially be much warmer in a cold room sitting next to a glowing heating coil (unless you blow its heat right at you).

For a leaky room, where cold comes in from sources such as windows or other things, you will still be warmest near or blown by a glowing heating coil.

If you have a leaky habitation then I suggest that you invest in something like a $60 Sears Craftsman Infrared Thermometer with Laser Pointer. You can check the temperature of windows, cracks, mysterious crevices in the bottom of a closet, and such to see where cold comes in.

Various remedies that work for offending ingresses are: 1) putting tape over small ones, 2) taping off a range with a square of garbage bag plastic, 3) using a window sealing kit (which is just a big film of plastic taped to the frame), 4) using a spray can of foam insulator to block up small to medium ingressive sources of cold.
posted by MonkeySaltedNuts at 6:35 PM on December 20, 2006


Oh: setting up the computer in a corner also helps heat that corner, heat that would otherwise be dissipated and wasted in the middle of the room. I found this out by unwitting trial and error when my current motherboard's heat sensor would turn the damn thing off.
posted by davy at 6:37 PM on December 20, 2006


As far as computers being a more or less efficient heat generator than any other heater, the way to think about it is conservation of energy. Some amount of energy enters your computer from the wall outlet; any energy that is not stored in your computer exits as either heat or as some other kind of radiation (radio waves from wireless, a little bit of light from LEDs, light from your monitor). There isn't much energy to be stored in a PC - you can store a tiny little bit in flash or in flipping magnetic domains on a hard drive; you can store a little bit more in capacitors and onboard batteries (though these will mostly be charged and so not storing much energy). All the rest will be radiated as heat.

Since there's essentially nowhere for that energy to be stored, nearly all the energy that enters the PC must be reemitted as heat. Thus your PC produces pretty much as many watts of thermal heat as watts of electrical power it draws. This is true for most any appliance that doesn't do real work - things like water pumps are an exception because they do real work.

As a final theoretical aside, even a perfect computer without resistive losses and other inefficiencies would still emit heat - there is a necessary entropy increase on bit erasure. See http://en.wikipedia.org/wiki/Landauer's_Principle
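
For scale, the Landauer limit works out to a vanishingly small energy per bit (a quick sketch, assuming room temperature):

```python
# Landauer's principle: minimum energy to erase one bit is k_B * T * ln(2).
# This theoretical floor is utterly dwarfed by a real PC's ~100 W of dissipation.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

e_per_bit = k_B * T * math.log(2)
print(f"~{e_per_bit:.2e} J per bit erased")                          # ~2.87e-21 J
print(f"~{e_per_bit * 1e12 * 1e9:.1f} nW at 10^12 bit erasures/s")   # only a few nanowatts
```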
posted by pombe at 6:43 PM on December 20, 2006


MonkeySaltedNuts - maybe I chose my words a little carelessly. You will reach equilibrium (and a more uniform heat distribution) faster with convection than with conduction. If all you had was a radiator in a room (or a hot computer tower), you would achieve the same temperature as blowing equally hot air into the room, but at a different rate. Plus, you'd get a nice hotspot near the radiator.

Yes, you'd be warmest sitting on the radiator, but they're not usually very comfortable or in prime TV-watching position.
posted by backseatpilot at 7:02 PM on December 20, 2006


I think the real question is, does the heat come out where you want it, and how you want it. If your heat source is back in the corner, away from where you sit, it isn't heating what matters. And, if you end up overheating a room so that it will stay warm for hours after the heat source has been turned off, the energy losses through the walls will be much higher.

Flash memory can store a little bit, I guess. I forget the magnetic details of hard drive platters - is there any energy storage there?

There is effectively no energy storage. Approximately half the bits are in one of two possible states. Then later, after you move a whole bunch of files around, approximately half the bits are in one of two possible states. It's an interesting notion though :P
posted by Chuckles at 7:22 PM on December 20, 2006


My roommates hated paying for heat, so I ended up in my room a fair amount, with my computer running. I know that they heat the room up.

As far as absolute efficiency goes, a computer converts 95%+ of the incoming electricity into heat. A tiny amount goes out over the ethernet wire, or wireless, or out via light. Basically, if you ignore the radiation issue (getting heat to the corners), a space heater and a computer are equivalent. In a small space, that doesn't matter, and I say get as much use out of the electricity as you can, but do make sure the electricity isn't too expensive compared to gas heat.
posted by cschneid at 8:08 PM on December 20, 2006


you certainly couldn't turn your heat off in the middle of winter and simply rely on your computer for warmth.

I have a dual 2.5 GHz Power Mac G5 and a PC in my home office and in the winter, I keep the heat turned off in that room entirely. Only when it falls below 30 do I really need to turn on the auxiliary heater.
posted by kindall at 8:14 PM on December 20, 2006


Can someone please post some credible links on this topic? I've taken my share of physics classes but I'm still not thinking about it the way most of you seem to be.
posted by Science! at 8:44 PM on December 20, 2006


It's not just that a computer uses less energy than a heater, therefore produces less heat. I'm saying for any given amount of energy a computer produces as little heat as possible because it has other work to do, while a heater produces as much as it can because that's its only job.

You're missing that essentially all of the electricity that goes into a computer ends up coming out as heat. There are few other ways for it to escape, and it certainly isn't being stored. You don't need to understand anything about what happens inside a computer, the only rule here is conservation of energy.

(another way to put this: the computing that happens in a computer is a useful side effect of converting electricity to heat, not the other way round)
posted by cillit bang at 9:47 PM on December 20, 2006


Science, when engineers design electronic equipment one of the steps is to do a power and thermal analysis. For the thermal analysis they just calculate or measure the amount of electrical energy going into the device and equate that to the thermal energy they have to remove from the device to keep it from overheating. Energy input is equal to heat output.
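
A toy version of that thermal calculation (the numbers are illustrative, not taken from any real datasheet):

```python
# Toy thermal analysis: all electrical power into the chip becomes heat to remove.
# Steady-state junction temperature = ambient + power * thermal resistance.
# These numbers are illustrative, not from a real part's datasheet.
chip_power_w = 65.0        # assumed power dissipated by the chip
theta_ja = 0.5             # assumed thermal resistance with heatsink and fan, degC/W
ambient_c = 25.0           # assumed air temperature around the heatsink

junction_c = ambient_c + chip_power_w * theta_ja
print(f"junction temperature ~ {junction_c:.1f} degC")   # ~57.5 degC
```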

You said: It's not just that a computer uses less energy than a heater, therefore produces less heat. I'm saying for any given amount of energy a computer produces as little heat as possible because it has other work to do, while a heater produces as much as it can because that's its only job.

Am I thinking about this wrong?


Yes, you are thinking about this wrong. For a given amount of input electrical energy, a computer converts all of it to heat. Computers are designed to use as little energy as possible to perform their job of computation, but all of it ends up as heat. Perhaps your confusion is that you keep saying that the computer does work. Yes, but only in an informal sense. A computer does no work in a physical thermodynamic sense.
posted by JackFlash at 10:31 PM on December 20, 2006


Get yourself a great big old 21" CRT (I used to have an old NEC Multisync), and run lots of high intensity software with lots of graphics. It used to make my room unbearable in the warm months, and noticeably warmer during the cold ones.
posted by tomble at 12:03 AM on December 21, 2006


Purely anecdotal, but my computer (an 18 month old Dell Dimension 3000 with 17" CRT monitor) causes my very small bedroom (9' x 12') to get very hot, especially in the summer. It throws off a nice amount of heat in the winter too, although I live in NC, so it doesn't really get that cold outside.
posted by katyggls at 1:15 AM on December 21, 2006


As has been stated by JackFlash & a few others, the heating efficiency of computers is practically 100%, same as a fan heater. Electrical watts in equals thermal watts out. Air motion barely counts - it's a tiny amount of power and is converted to heat by friction in the air anyway. Light emitted by the monitor, LEDs, etc, is in the milliwatts range or less, compared to 100-400W for the PC.

However, the heating effectiveness may not be as good as a proper space heater because of heat-distribution issues.

The next thing to consider is that refrigerative (reverse-cycle airconditioner) heaters can be more than 100% "efficient" though it's a slightly bogus definition of efficiency. The amount of thermal energy they transfer from the outside of the building to the inside is greater than the amount of electrical energy they consume, therefore you get more heat in your house using this method than you do with a simple resistive element that dissipates electrical power as heat.
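
In numbers (the COP here is just an assumed, typical figure):

```python
# Heat delivered per kWh of electricity: resistive element vs. heat pump.
# The COP (coefficient of performance) of 3 is an assumed, typical figure.
electricity_kwh = 1.0
cop = 3.0   # assumed

resistive_heat_kwh = electricity_kwh * 1.0   # resistive: ~1 kWh of heat per kWh in
heat_pump_kwh = electricity_kwh * cop        # heat pump also moves heat in from outdoors
print(f"resistive: {resistive_heat_kwh} kWh of heat; heat pump: {heat_pump_kwh} kWh of heat")
```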

Last thing is: cost of energy. Gas is usually cheaper per joule than electricity, so you may be better off heating with that.

As an aside, have a look at Reversible Computing. Erasing information costs energy - there is an increase in entropy; reversible computers (for which it is difficult to construct useful algorithms) don't destroy information so aren't subject to the von-Neumann/Landauer limit. Of course, this energy consumption is the tiniest fraction of the energy dissipated by your computer, but it is interesting to know there is a fundamental quantity of energy required to perform classical computation.
posted by polyglot at 2:11 AM on December 21, 2006


Another thing to note: portable space heaters (for example) may be only as efficient as a computer, meaning that the percentage of energy converted to heat is the same, but they certainly output more total heat than a computer would. Space heaters can output as much as 5000 BTU/hr, which is over 1000 watts. The average computer isn't putting out even 100 watts if it's being used lightly. Here's a page that gives some average power usages for home computers (note that CRT monitors themselves use 80 watts! Those are nice little heaters!)
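
The unit conversion, plus a rough monthly figure for mzurer's bill question (the usage hours and price are assumptions):

```python
# BTU/hr to watts, plus a rough monthly electricity cost for a lightly used PC.
# The usage hours and price per kWh below are assumptions.
WATTS_PER_BTU_HR = 1055.06 / 3600   # 1 BTU = 1055.06 J

print(f"5000 BTU/hr ~ {5000 * WATTS_PER_BTU_HR:.0f} W")   # ~1465 W

pc_watts = 100        # assumed average draw, PC plus CRT, light use
hours_per_day = 6     # assumed
price_per_kwh = 0.12  # assumed $/kWh
monthly_kwh = pc_watts * hours_per_day * 30 / 1000
print(f"PC: ~{monthly_kwh:.0f} kWh/month ~ ${monthly_kwh * price_per_kwh:.2f}")  # ~18 kWh, ~$2.16
```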
posted by muddgirl at 8:41 AM on December 21, 2006


The next thing to consider is that refrigerative (reverse-cycle airconditioner) heaters can be more than 100% "efficient" though it's a slightly bogus definition of efficiency.

Yes. They are called heat pumps. The problem is that you need to factor in the efficiency of electric power generation. One BTU = about 0.29 watt-hours. However, burn one BTU of fossil fuel and even in the most efficient modern electric generators you get about 0.11 watt-hours of electricity. So, having wasted almost two thirds of the energy in that BTU, you could not find a heat pump efficient enough to make up the difference. Burn that same BTU in your modern furnace and get about 0.95 BTUs of heat into your dwelling.
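
The same arithmetic spelled out (the efficiencies are the rough figures above):

```python
# Per BTU of fossil fuel: burn it at the power plant vs. burn it in your furnace.
# Efficiencies are the rough figures from the comment above.
BTU_TO_WH = 0.293           # 1 BTU is about 0.293 watt-hours of thermal energy
plant_efficiency = 0.38     # assumed efficient modern generating plant
furnace_efficiency = 0.95   # assumed modern furnace

electric_wh = BTU_TO_WH * plant_efficiency         # delivered as electricity
furnace_wh = BTU_TO_WH * furnace_efficiency        # delivered as heat from the furnace
print(f"electricity delivered: ~{electric_wh:.2f} Wh per BTU of fuel")   # ~0.11 Wh
print(f"furnace heat:          ~{furnace_wh:.2f} Wh per BTU of fuel")    # ~0.28 Wh
```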
posted by caddis at 8:53 AM on December 21, 2006


pombe writes "Thus your PC produces pretty much as many watts of thermal heat as watts of electrical power it draws. This is true for most any appliance that doesn't do real work - things like water pumps are an exception because they do real work."

Your refrigerator makes a pretty good space heater too.

tomble writes "Get yourself a great big old 21" CRT (I used to have an old NEC Multisync), and run lots of high intensity software with lots of graphics."

My only regret in moving from a 20" flat glass CRT to an LCD panel is that I lost a prime PopTart heating location. I use the power brick for my laptop now, but the bloody thing tends to fall under my desk.
posted by Mitheral at 10:40 AM on December 21, 2006

