

Ways to save
February 12, 2010 9:53 AM   Subscribe

I'm trying to save money on monthly bills. I recently read that "40% of all electricity used to power home electronics is consumed while the products are turned off." Is this accurate? If so, what devices or ideas are there to conveniently "unplug" things (whose plugs often have 3 prongs)? Related tips for stopping money leaks like this are welcome.

I've seen this blog post with 2-prong outlet switches, but wondered what alternatives there were, and also wondered what other money leaks I didn't know about.
posted by cashman to Home & Garden (26 answers total) 29 users marked this as a favorite
 
I know about power strips.
posted by cashman at 9:54 AM on February 12, 2010


I think that figure is VERY fishy. Maybe it was true 15 years ago. I don't know. But in any case, what you need is one of these power meters. Then you can measure for sure how much your appliances use in all their states - off, suspended, low activity, high activity, etc.

When I did this, I found that almost all of my appliances used between 0 and 1 watts when off. Also, it used to be true that AC/DC converters wasted power when plugged in but not powering anything, but it's not true anymore, at least not for the ones I own for my computer and cellphone (at least, they used less than 1 watt - the meter I had can't measure less than that.)
posted by Salvor Hardin at 10:06 AM on February 12, 2010 [1 favorite]


I would also suggest one of those meters. When I bought one and tested stuff, the electronics that are on all the time used tiny, tiny amounts. Like the router is 2 watts. Most electronics don't really like being unplugged, since they use that trickle current to maintain their settings memory, so I didn't find it worthwhile to do anything about it.

The things that I found that were really using power:

1) Refrigerator.
2) DVR, since they don't actually turn off at all.
3) Fans or air filters. My air filter was drawing 30 W and was on 24 hours a day. That's 21 kilowatt-hours per month.
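
The arithmetic behind that last figure, as a quick Python sketch (the 30 W draw is the measured number above; 30 days per month is an assumption for a rough estimate):

```python
# Monthly energy for a device running around the clock.
watts = 30            # measured draw of the air filter
hours_per_day = 24    # on 24 hours a day
days_per_month = 30   # assumed, for a rough estimate

kwh_per_month = watts * hours_per_day * days_per_month / 1000
print(kwh_per_month)  # 21.6 kWh per month
```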
posted by smackfu at 10:11 AM on February 12, 2010 [1 favorite]


I use these smart strips for my PC and TV. When the PC shuts down the strip kills the power to all the other devices plugged into the switching receptacles, so you can have it take down your monitor, speakers, printers, etc.
posted by IanMorr at 10:15 AM on February 12, 2010 [2 favorites]


Yeah, our refrigerator was also the single largest long-term drain - ours uses around 105 watts more or less continuously. You could try turning the temperature up a bit in yours to save power. I didn't test how effective that is, though, because it was already about as warm as it can be.
posted by Salvor Hardin at 10:18 AM on February 12, 2010


Anecdata: last year I was away from home for two weeks. Unplugged all home electronics while away. There was no noticeable effect on my monthly electric bill.

I rarely make heavy use of summer air conditioning, but when I do, that does make a difference. Otherwise, the fridge is the gorilla in the room, like others have said.

I'll note that other items in my house that heat things--oven, water heater, clothes dryer, furnace--are all gas appliances.
posted by gimonca at 10:35 AM on February 12, 2010


If you have a laser printer at home, they drain a ton of power (relative to other "small" appliances). My estimates from a Kill-A-Watt show that many of these draw 100W in power-save mode and north of 500W during use. But that's because they need to keep the fuser hot.

Deskjets and other printers don't have this "problem."

Anything that heats or cools is going to be the biggest draw of power in the home.
posted by zpousman at 10:37 AM on February 12, 2010 [1 favorite]


Check your computer monitor. My old one draws 12w as long as it's plugged in, even if it's switched off (not in standby). My new one (LCD) uses 38w in use, and less than 1w when in standby.

If you use a PC (and possibly a Mac too, but I have no experience), look into the level of standby it goes into.

If you can, keep your fridge/freezer in a cool room. It will make it easier for the appliance to stay cool inside. All of the heat in there gets vented to "outside" (the room it's in) so keeping the room cooler helps draw the heat out of the coils.
posted by Solomon at 10:47 AM on February 12, 2010


I would also suggest one of those meters. When I bought one and tested stuff, the electronics that are on all the time used tiny, tiny amounts

This jibes with my experience and experimenting. I question that the 40% was ever right, but I certainly do not believe it's true now. DVRs, possibly, but as was pointed out above, there's only so "off" you want them to really be.

I wandered the house with the kill-a-watt and not a single electronic device approached the consumption of even my CFL bulbs.

Yeah, our refrigerator was also the single largest long-term drain - ours uses around 105 watts more or less continuously.

Apparently the strides forward in power consumption on fridges are TREMENDOUS, to the point where when I was looking for a basement beer fridge a few years ago I was told by several sources not to consider anything that wasn't practically new - the difference in power consumption would supposedly eat up most of the savings. The calculator on the Energy Star website asserts that a pre-93 fridge consumes 3x the wattage of a modern fridge and a 93-2000 model consumes 2x as much.

The only dormant electronic device I'd consider worth powering up and down is a laser printer because of the periodic drum-spin. It may well be that modern ones (mine is 6+ years old now) don't even have that issue.
posted by phearlez at 10:47 AM on February 12, 2010


David MacKay, the chief scientific adviser to the United Kingdom Department of Energy and Climate Change, has written about this in his book, Sustainable Energy Without the Hot Air. He thinks that, while you can save a small amount by turning electronic devices off, the emphasis on unplugging them is a little silly:
One of the greatest dangers to society is the phone charger. The BBC News has been warning us of this since 2005:
“The nuclear power stations will all be switched off in a few years. How can we keep Britain’s lights on? ... unplug your mobile-phone charger when it’s not in use.”
For anyone whose consumption stack is over 100 kWh per day, the BBC’s advice, always unplug the phone charger, could potentially reduce their energy consumption by one hundredth of one percent (if only they would do it).
Every little helps!
I don’t think so. Obsessively switching off the phone-charger is like bailing the Titanic with a teaspoon. Do switch it off, but please be aware how tiny a gesture it is. Let me put it this way:
All the energy saved in switching off your charger for one day is used up in one second of car-driving.
He also has a table showing how much power various devices use when they are on or on standby.

The way I think about it, time and mental energy spent on unplugging electronics is wasted, and would be far better off spent figuring out how to move and heat stuff less (by far the biggest personal energy expenditures of any of us). Don't unplug your phone charger, put on a sweater and travel by private automobile less.
posted by grouse at 10:57 AM on February 12, 2010 [4 favorites]


I forgot the link to the chapter of Prof. MacKay's book and to add ellipses where I elided some of the text...
posted by grouse at 10:59 AM on February 12, 2010


The four top users of power in most homes are refrigerators, electric water heaters (if), lighting, and electric heating (if). Everything else is negligible by comparison.
posted by Chocolate Pickle at 11:33 AM on February 12, 2010


This is just from memory, but I recall an article mentioning that video game consoles use a lot of power while idle/off. Try a kill-a-watt meter on them. IIRC it was PS3 using the most, Xbox 360 in the middle, and Wii using the least.
posted by IndigoRain at 11:38 AM on February 12, 2010


Even if you are using gas heating, it will still take up an amount of energy that will dwarf standby gadgets.
posted by grouse at 11:39 AM on February 12, 2010


"40% of all electricity used to power home electronics is consumed while the products are turned off."

Note that this is stated strangely. It's not saying that 40% of the electricity used in your home is used that way. What it's saying is that of the, say, 5% of your electricity that goes to home electronics (with the other 95% going to heating, lighting, and refrigeration), 40% of that slice (2% of your total) is consumed when the home electronics are off. (Numbers used for example purposes only, not based on any citation, and get off my case.)
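
To make the distinction concrete, here's the same made-up 5% example worked out in Python (both numbers are illustrative, not cited):

```python
# Illustrative numbers only, as above: suppose 5% of household
# electricity goes to home electronics.
electronics_share = 0.05   # fraction of total household electricity
standby_fraction = 0.40    # the "40% of electronics' power" claim

# The claim is 40% of that 5% slice, i.e. 2% of the whole bill,
# not 40% of the whole bill.
share_of_total = electronics_share * standby_fraction
print(share_of_total)  # 0.02, i.e. 2% of total household electricity
```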
posted by Chocolate Pickle at 11:56 AM on February 12, 2010 [1 favorite]


Pickle makes an excellent point. I think this is a vast, left-wing conspiracy. There is some idle drain on most devices that run on DC, but it's quite small these days. By small I mean comically small relative to your fridge, washer, dryer, HVAC, and water heater.

My brand new computer with a huge monitor is plugged into a power monitor. At full bore it can draw 200 watts or more, though that's rare. Usually it's under 140. When I put it to sleep (not off) it draws so little the meter can't measure it accurately. It's somewhere under 5W. The long term average turns out to be 38W.
posted by chairface at 12:07 PM on February 12, 2010


Smart strips rock. I put my TV arrays and my computer array on smart strips and saw fairly significant savings on my electric bill, around $4-6/month. And that's without all the hassle of constantly flipping the powerstrip on and off, which I never remember to do. :)

I tend to use electronics until they die a slow, painful death, so I had some older, far-less-efficient electronics on those arrays, but ... they need power strips anyway and $48/year is nothing to sneeze at.

(I realize this would be more helpful if I knew how much my electric bill was, but it's bundled with my gas bill, so the only time I paid attention to the electric portion alone was when switching to CFLs and putting in the smart strips, and I only remembered the savings, not the total electric cost.)
posted by Eyebrows McGee at 12:21 PM on February 12, 2010


I think this graph will shed some light on the subject. According to those figures, 72% of most households' electricity consumption goes to, in descending order of size, heating, cooling, water heating, and refrigeration.

If you aren't going to do something about your dryer, water heater, refrigerator, air conditioner, and/or washing machine, seriously, don't bother. You'll maybe save yourself a couple of bucks here and there at the cost of more time, money, and hassle than the savings is worth.

You stand to save yourself a lot more money by shifting your power usage than by trying to trim it around the edges. Many utility companies have plans whereby power is expensive during the day but cheap at night and on weekends. This helps them by stabilizing demand, which is easier on the grid. So instead of $0.067/kWh 24/7, I pay $0.124/kWh between 7AM and 9PM M-F, but $0.023/kWh nights and weekends. My power bill dropped from $80/month to $25-30/month, and I'm using just as much power as I was before. I just do laundry on the weekends and run the dishwasher after 9PM.
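
A rough sketch of how those rates play out, in Python. The rates are the ones from my plan above; the 400 kWh monthly total and the 80% off-peak split are assumptions for illustration, not my actual usage:

```python
# Compare a flat rate against a time-of-use plan for the same usage.
# Assumed for illustration: 400 kWh/month, 80% shifted off-peak.
monthly_kwh = 400
flat_rate = 0.067                        # $/kWh, 24/7
peak_rate, offpeak_rate = 0.124, 0.023   # $/kWh, 7AM-9PM M-F vs. nights/weekends

offpeak_share = 0.80   # laundry and dishwasher shifted to nights and weekends
flat_cost = monthly_kwh * flat_rate
tou_cost = (monthly_kwh * (1 - offpeak_share) * peak_rate
            + monthly_kwh * offpeak_share * offpeak_rate)
print(round(flat_cost, 2), round(tou_cost, 2))  # same kWh, smaller bill
```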

Helps that I'm single and have no roommates, I suppose...
posted by valkyryn at 12:39 PM on February 12, 2010 [1 favorite]


I'd really suggest popping $25 or so for a Kill-A-Watt, or similar. I'm generally with the people advising you not to worry about the small draw from electronic equipment on standby, but it's still really interesting to see what *your* equipment actually draws.

I found that underclocking and undervolting the AthlonXP I was using as my home server made next to no difference in its "idle" power draw (~50W), and cut peak power draw by a trivial amount, especially since it cut peak performance by 30-50%. Even more surprising, I found that the AthlonXP system drew almost exactly the same amount at idle as the "low-power" VIA C3 system I'd been using.

My laser printer uses a lot of power when it's printing, and a fair amount when it is ready to print, but it goes into idle mode after about 10 minutes (and stays there most of the time), after which it draws only a few watts, so I stopped bothering with turning it on and off.

The dehumidifier in our basement draws a decent amount of power, but it only works out to a few bucks a month, a small price for keeping our basement from accumulating mildew.
posted by Good Brain at 2:45 PM on February 12, 2010


The five top users of power in most homes are refrigerators, electric water heaters (if), lighting, air conditioning (if), electric heating (if), and a ruthless dedication to the pope...
posted by Chocolate Pickle at 3:54 PM on February 12, 2010


I sprang for a Kill-A-Watt for various reasons, including the subject of this AskMe. One of the things I found is that my cell phone charger's wall wart, when not in use, takes about a day to draw 1 watt-hour. And yet the phone it came with reminds me to unplug the charger when not in use to save all that electricity.

Computers and monitors can be an unexpected drain. Especially older computers nerds compulsively keep around and running. Or gaming rigs with 800W power supplies. Video gaming systems can also be a drain. The Wii for example, basically doesn't turn off during normal operation. Instead it turns off many of the peripherals and leaves the wifi running to check for mail and updates periodically.

These findings are widely reported, and you don't need a Kill-A-Watt to act on them. Turn off WiiConnect24 (red LED instead of yellow), throw away the old computers and pick up a virtual private server if you must, and set your screen saver to turn your monitor off. The only reason I can think of for someone to recommend you purchase one is that they have a used one they want to get rid of.
posted by pwnguin at 4:29 PM on February 12, 2010


Or gaming rigs with 800W power supplies.

The rating of the power supply has next to nothing to do with how much power it will actually consume. It will only supply as much current as the components it's connected to draw. So yes, it is true that gaming machines with fast CPUs and video cards do draw more power than lesser computers, and they have power supplies with large numbers on them, but until you've connected said machine to a meter you can say absolutely nothing about how much it draws. I would not be surprised at all to find that most people who buy 800 watt power supplies tend to use only 100-200 watts, spiking up to 300 or so when playing a game.
posted by Rhomboid at 5:04 PM on February 12, 2010


Power supplies are generally inefficient when you're far away from their peak output. I haven't empirically verified this however, and I don't know of any studies off the top of my head.
posted by pwnguin at 5:11 PM on February 12, 2010


Some things to consider - note that it said "electronics" and not "electrical". Things like light bulbs, refrigerators, hair dryers, toasters and power tools are not being considered in this statistic. What they're looking at are things that use some kind of step-down transformer to go from 120V (or whatever your local power is) to 12V (or whatever they demand).

Grabbing a wall wart out of my bag of electronics stuff, I find it's rated for 12V and 500 milliamps. Taking the equation power = volts × amps, I find that this little transformer will suck about 6 watts at its peak. I'm pretty cynical about such claims and suspect that it probably never draws more than a watt or two and would die a horrible death after about 15 minutes at six watts, but YMMV.

Let's assume that with no load my wall wart pulls 1 watt. By contrast, a 100 watt lightbulb draws 100 watts, a microwave or toaster about 1000, and a hair dryer about 1500 watts. If you have 20 little transformers like this, each sucking a watt all day long, that's less electricity than leaving a 100 watt lightbulb on for 5 hours or 30 minutes of toast making.
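
That comparison, sketched in Python (the 1 watt idle draw is the assumption from above):

```python
# Daily energy, in watt-hours, for each scenario in the comparison above.
warts = 20 * 1 * 24    # twenty 1 W wall warts idling for 24 hours
bulb = 100 * 5         # a 100 W lightbulb left on for 5 hours
toaster = 1000 * 0.5   # 30 minutes of toast making at 1000 W

print(warts, bulb, toaster)  # 480 500 500.0 watt-hours
```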

So yes, these little transformers are sucking electricity all the while they're plugged in. I'm not sure about the 60/40% numbers, but since a lot of these things spend a lot of time turned off, it might be true. Note, though, that it's a percentage, not an absolute number. Most electronics don't use that much energy to begin with. Putting a lot of effort into unplugging your phone charger and coffee maker (with that cute built-in clock) might save you a buck or two a month, but if you put that effort into doing other energy conservation type things (notably things that will make your furnace and air conditioner run less) the payoff will be much larger.
posted by Kid Charlemagne at 5:36 PM on February 12, 2010


Power supplies are generally inefficient when you're far away from their peak output.

Not modern computer power supplies. To take a randomly selected example from an anandtech review of a 1000W-rated supply, the efficiency remained in a tight 81% - 84% band over the whole operating range except for the very low end where it dropped to "only" about 77%. And that is a year and a half old model. The newest ones are boasting figures of 90% or better.
posted by Rhomboid at 6:00 PM on February 12, 2010


In case you're not aware: it's really easy to read the electrical meter outside your house to determine your current rate of energy consumption. You don't need to wait for the electric bill to arrive, or for the little dials to turn.

Assuming the common analog meter here: measure how long it takes the spinning disc to turn once. If it's spinning fast, count the passing of the big black zero mark. If it's spinning slow, use the 100 ticks to extrapolate. Take 3600 times 7.2 (or whatever other Kh number is printed on the meter) and divide by the rotation time in seconds. Voilà: your current consumption in watts.
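
That formula as a small Python function (Kh, printed on the meter face, is watt-hours per disc revolution; the 26-second timing is just an example reading):

```python
def watts_from_meter(seconds_per_rev, kh=7.2):
    """Instantaneous household draw from an analog meter's spinning disc.

    Kh is watt-hours per revolution, so power in watts is
    3600 * Kh / (seconds per revolution).
    """
    return 3600 * kh / seconds_per_rev

# e.g. one disc revolution every 26 seconds on a Kh = 7.2 meter:
print(round(watts_from_meter(26)))  # 997 W, roughly a 1 kW load
```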
posted by rlk at 6:06 PM on February 12, 2010 [1 favorite]


This thread is closed to new comments.