Matter cannot be created or destroyed.
December 2, 2011 4:37 PM

If you are going to be out of the house for a few days, does it make any sense to turn the heat down? My engineering-OCD has been bugging me on this one.

Let's say you find your house to be comfortable at 66 degrees F, that it is continuously 32 F outside, and that you are lucky enough to be in a house so well-insulated that it only loses 5 degrees F per day. If you leave the house on Sunday, by Monday it has fallen 5 F to 61, and so on, and when you return on Saturday night, the house is at 36 F. Then you turn the heat back on to bring it all back up to 66 F. Is the energy required to bring a cold house and its cold contents all back up to 66 the same as if you'd just left the thermostat at 66?
posted by shipbreaker to Science & Nature (18 answers total) 2 users marked this as a favorite
 
No, because the rate of heat transfer between two spaces is proportional to the temperature difference between them. Your house loses heat faster at a constant internal temperature of 66 deg F than it would at a temperature declining at the rate you mention.

Your house would probably actually lose 5 degrees the first day, 4 the second, 3.5 the third day, etc, and eventually approach the ambient temperature.
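A quick sketch of that decay, using a simple Newton's-law-of-cooling model; the cooling constant below is made up, picked so the first day's drop is about 5 F:

```python
import math

# Newton's law of cooling: the house approaches the outside temperature
# exponentially, fast at first and slower as the gap shrinks.
T_out = 32.0      # outside temperature (F)
T_start = 66.0    # indoor temperature when the heat goes off (F)
k = 0.16          # cooling constant per day (assumed, not measured)

temps = [T_out + (T_start - T_out) * math.exp(-k * day) for day in range(8)]
for day, T in enumerate(temps):
    print(f"day {day}: {T:.1f} F")
# The first day drops about 5 F, the next about 4.3 F, and the
# temperature levels off toward 32 F instead of falling below it.
```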
posted by zompus at 4:45 PM on December 2, 2011


Your house does not simply leak 5 degrees per day. If you extrapolate your projections a few more days, your house would be colder than the outdoors! No, the amount of energy lost decreases as the house approaches the same temperature as the outdoors. So it would be cheaper to let it lose that heat, and reheat it afterwards.

However, there are other issues to worry about - if you let it get too cold, your pipes will freeze.

Many people do set the thermostat to a low setting when going away for a while. This keeps cost down but also prevents pipes from freezing if the weather changes unexpectedly or if they overestimated their insulation (or they're delayed returning).

The only other consideration is how long it takes to warm up again to a comfortable level upon your return. But that's a problem sweaters can solve.
posted by aubilenon at 4:53 PM on December 2, 2011


Is the energy required to bring a cold house and its cold contents all back up to 66 the same as if you'd just left the thermostat at 66?

No, it'll be less.
posted by one more dead town's last parade at 4:54 PM on December 2, 2011 [1 favorite]


Agreed. The same holds on the way back up in temperature. The cooler the house is, the less total heat it is losing, per unit time, to its surroundings. The longer the trip, the more pronounced this advantage becomes.

From an optimization standpoint:
-where the trip = 0 days, the strategies are equal
-where the trip approaches inf days, "turn off" increasingly dominates

The same is true for air conditioning in hot weather with the added issue that traditional A/C units actually get less efficient as you turn down the thermostat.
posted by milqman at 4:56 PM on December 2, 2011


Turning the heat down is good, but you wouldn't want to turn it off. The danger is that the pipes would freeze.
posted by Chocolate Pickle at 4:57 PM on December 2, 2011 [1 favorite]


I can't speak to the physics, but I asked about this in some random MetaFilter thread a few years ago. I had heard from someone that it uses more energy to turn down your thermostat when you're out of the house, because when you raise it again you have to heat everything back up to the same temperature, and that demands more energy. I wasn't so sure about that, and comparing notes here, someone said they'd heard the same thing from a heating technician - and yet, invariably, when they turned their thermostat down, they used less heating fuel and saved money. Over several years, the same has been true for me: turning it down results in less fuel use and cost savings. I can only conclude that it takes less energy to keep up with the fluctuations than it would to hold a steady temperature even when you're not there.

Floppy and vague, but empirically, it seems better to turn down the thermo when you're not at home. You could, of course, run this test yourself.
posted by Miko at 5:21 PM on December 2, 2011


Here is one way to think about it. Suppose the thermostat is at 66F during the day and you turn it down to 60F at night. To make the situation easier, suppose the house cools from 66F to 60F very quickly and likewise heats from 60F to 66F very quickly in the morning. Then the "heat savings" overnight is equal to the difference between how fast the house loses heat when it is at 66F and how fast it loses heat when it is at 60F. This is basic thermodynamics.

One implication of this is that when you add insulation, you lower the rate of heat loss (at every temperature), which is clearly a good thing. But it also lowers the "heat savings" from turning down the thermostat at night.
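A toy calculation of that trade-off, with made-up whole-house loss coefficients (BTU/hr per degree F) for the house before and after insulating:

```python
T_out = 32.0        # outside temperature (F)
U_before = 400.0    # whole-house loss coefficient, uninsulated (assumed)
U_after = 250.0     # same house with added insulation (assumed)

def loss_rate(U, T_in):
    """Heat loss in BTU/hr, proportional to the indoor/outdoor difference."""
    return U * (T_in - T_out)

# Overnight "heat savings" from a 66 -> 60 setback is the difference
# between the loss rates at the two temperatures:
savings_before = loss_rate(U_before, 66) - loss_rate(U_before, 60)
savings_after = loss_rate(U_after, 66) - loss_rate(U_after, 60)
print(savings_before, savings_after)  # insulation shrinks the setback savings too
```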
posted by notme at 5:21 PM on December 2, 2011


I'd be happy to be proven wrong, but I've never seen an authoritative answer to this question. There are just too many variables that differ from house to house. The answer depends on:

- The temperature gradient between indoors and outdoors (which fluctuates night/day and as the house cools)
- How well insulated the house is
- How efficient the furnace is

I think the only way to do it is to test it empirically on your house and see what works for you. This is an engineering problem, not a physics problem.
posted by auto-correct at 6:28 PM on December 2, 2011 [2 favorites]


Turn the heat down. I can't remember the exact article now, but it showed that it costs less energy/money to raise the heat back up than to keep it high. In fact, it is more efficient to turn it down every day while you are out of the house.
posted by catatethebird at 6:48 PM on December 2, 2011 [1 favorite]


With very few exceptions it is cheaper to lower the heat whenever you are out of the house. This is pretty similar to the argument about whether it costs more to turn off a light and back on, or keep it on.

The exceptions are homes that are particularly well insulated or have a high-thermal-mass heating system. In Vermont that would mean R20 in basements, R40 in walls, R60 in the roof, and at least triple-glazed windows. In other climates the insulation levels are lower, but still not levels you'd reach without a specific project to achieve them.

The other exception is radiant-heat concrete floors. This is more an issue of reasonableness and comfort than of energy consumption. These systems heat by maintaining a slab temperature a few degrees higher than the air, so that heat lost to the outside is steadily replaced. If allowed to cool, the slab takes hours to warm back up because of its mass, and in the meantime there is no way to warm the house. So it's not reasonable to set those systems back unless you'll be gone for at least several days.

On the other hand, significant savings can be had in almost every type of house by setting the temperature down daily when you go to sleep and again when you leave the house for work. This is why programmable setback thermostats are almost always a good investment when used properly. In an "average" U.S. climate and heating scenario you can save over a hundred dollars a year by simply setting the temperature down during the parts of the day you leave the house. Of course, even more can be saved by setting down when on vacation.
posted by meinvt at 6:58 PM on December 2, 2011


Turn it down. As noted above, heat loss is proportional to the difference in temperature between the house and the outside. Say the average outside temperature is 40, and you keep it at 70. Every day, the energy you use to maintain that temp is (forgive what seems like mixed units, but meteorologists use the term; I'm an engineer) thirty degree-days. If you're gone four days, that's 120 degree-days of energy your heating system uses.

Now, say you put the thermostat down to 50. Now you're using (50-40)*4= 40 degree days. That's a lot of energy saved. "But wait," you exclaim, "what about warming the house back up to 70? That's a lot of energy to account for!" - and you'd be right. Let's just say your heating system had to run for six hours straight to get things back to 70. Furthermore, let's say that the system only has to run 1/4 of the time once it gets to a steady state. So now it's running continuously, four times as much heat input per hour, for six whole hours. OK, (70-40)*(.25 days)*(4 times the duty cycle)=30 degree days worth of energy. So you've used 40 while on vacation, and 30 to get the place back to normal...that's 70 degree-days worth of energy. Still just more than *half* what you'd use leaving the thermostat up.

As the average outside temp gets colder, the effect is less (since the furnace has to work harder for longer), but as the length of your vacation increases, the effect is stronger.
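That arithmetic, spelled out; the six-hour recovery and the 1/4 duty cycle are the illustrative assumptions from the example above:

```python
T_out, T_home, T_setback = 40, 70, 50   # average outside, normal, and setback temps (F)
days_away = 4

# Strategy 1: hold the thermostat at 70 the whole time.
hold = (T_home - T_out) * days_away        # 30 degree-days/day * 4 days

# Strategy 2: set back to 50 while away, then recover.
away = (T_setback - T_out) * days_away     # energy used while away
# Recovery: the system runs flat out for 6 hours (0.25 days), i.e. at
# 4x its normal 1/4 duty cycle, against the full 30-degree difference.
recovery = (T_home - T_out) * 0.25 * 4
setback = away + recovery

print(hold, setback)   # 120 vs 70.0 degree-days
```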
posted by notsnot at 7:04 PM on December 2, 2011


(As an aside, the bit about heat loss being proportional to the difference in temperature is Newton's Law of Cooling; and the standard textbook example is estimating a murder victim's time of death based on the corpse's and room's temperatures. THE MORE YOU KNOW)
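For the curious, that textbook example just runs the same cooling law backwards; every number below is invented for illustration:

```python
import math

T_room = 68.0     # room temperature (F)
T_normal = 98.6   # normal living body temperature (F)
T_found = 85.0    # body temperature when found (F)
k = 0.1947        # cooling constant per hour (assumed for the example)

# T(t) = T_room + (T_normal - T_room) * exp(-k * t); solve for t:
hours_since_death = math.log((T_normal - T_room) / (T_found - T_room)) / k
print(f"estimated time since death: {hours_since_death:.1f} hours")
```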
posted by hattifattener at 11:54 PM on December 2, 2011


I'd be happy to be proven wrong, but I've never seen an authoritative answer to this question. There are just too many variables that differ from house to house.

No, there really aren't any that matter. Really. Turning the heat off or down and allowing the interior temperature to move closer to the exterior temperature for some period is always an energy win. It might not be a comfort win but it will always be an energy win.
posted by flabdablet at 7:14 AM on December 3, 2011


The commonly accepted rule (where I am, anyway, about 8 miles south of the Mason-Dixon Line) is that the minimum ambient temperature to leave a house at is 55 degrees, as that is the minimum needed to avoid damage. My house hits 55 degrees every night, and 55 every day when we're not here, then rises to 66 when we're here and active. I have a decently insulated red brick house with new windows.

For real-world anecdata, we use forced-air gas heat and supplement with electric heaters in specific rooms, because I'm not lucky enough to have zoned heating, and heating one room is always smarter than heating the whole house. Since we installed the programmable thermostat, added a heated mattress pad on my bed, and put an electric heater in my son's room, our gas bill has dropped appreciably. I don't have an MCF number for you, and our gas and electricity are cheap here, but the savings have been very appreciable: roughly 40% of our total gas bill, even after accounting for the added electricity cost. This is our third winter in this house; we switched midway through the first winter after our eyes bugged out comparing the previous owners' winter bills with our own estimated use.

I realize this isn't exactly what you're looking for, but 55 is your important number.
posted by TomMelee at 7:36 AM on December 3, 2011


Even if the rate of thermal loss were the same between warm-warm and warm-cold, you would at best break even. The energy used to keep the house warm over time T (where T is big enough that the house would otherwise cool) must be equal to or greater than the energy used to get the house from cold to warm; otherwise there would be heat energy coming from nowhere.

So even in some bizarro-world where houses have super-high thermal mass and Newton's second law doesn't hold, conservation of energy can show that this is wrong.
posted by miyabo at 1:51 PM on December 3, 2011


No, there really aren't any [variables] that matter. Really.

There are a few, they're just not likely to be relevant here. The main one I can think of is that some forms of heating can be more efficient than others— specifically, some houses use a heat pump for heating, but if the differential becomes too large, then an electrical (or, conceivably, gas) heater comes on to keep up. So there could be some situations where the heat savings from letting your house cool down a bit are offset by the less-efficient heating technology used to warm it up rapidly afterwards. You'll use fewer BTUs but some of those BTUs might cost you more. Similarly, if your electric utility has variable-rate metering (common in the UK), you could maybe spend less to keep the house warm overnight with cheaper electricity than to warm it up in the morning with more expensive electricity.
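A back-of-the-envelope sketch of that heat-pump scenario; every figure here (the BTU totals, the COPs, the share of recovery heat falling to resistance backup) is an assumption for illustration:

```python
btu_hold = 120_000      # heat needed overnight if you hold the temperature (assumed)
btu_setback = 90_000    # heat needed with a setback plus morning recovery (assumed)
backup_share = 0.5      # fraction of setback BTUs supplied by resistance backup (assumed)

cop_heat_pump = 3.0     # heat pump: 3 BTU of heat per BTU-equivalent of electricity
cop_resistance = 1.0    # resistance backup: 1 to 1

elec_hold = btu_hold / cop_heat_pump
elec_setback = (btu_setback * (1 - backup_share) / cop_heat_pump
                + btu_setback * backup_share / cop_resistance)

# Fewer BTUs overall, but the setback costs more electricity here
# because half of them came from the inefficient backup heater.
print(elec_hold, elec_setback)
```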

However, I really doubt either of those apply in very many real-world situations. Unless you've actually measured your heat loss and done the math, I think you're better off turning the thermostat down whenever you don't actually need the house to be warm.

some bizarro-world where houses have super-high thermal mass and Newton's second law doesn't hold

Newton's Law of Cooling. It isn't one of Newton's Three Laws (of Motion).
posted by hattifattener at 2:49 PM on December 3, 2011


One other thing: the same logic doesn't hold up for cooling, because an air conditioner is more like a heat pump than a furnace. It's theoretically possible that for some houses, it's cheaper to leave the AC on while you're at work. I think this is very very unlikely, but unlike the heat situation it is physically possible.
posted by miyabo at 7:29 PM on December 3, 2011


Seems to me even less likely than in the heating case. What assumptions would make continuous cooling cheaper?
posted by flabdablet at 6:12 AM on December 4, 2011


This thread is closed to new comments.