Is it safe to keep a step-down converter turned on at all times?
February 17, 2007 12:09 AM
I just picked up a 1000W step-down converter (220V-110V) to power a receiver after a 750W unit exploded in a blaze of glory when I turned it on one morning. I'd very much like to avoid this kind of thing again. So my question is, is it relatively safe to keep step-down converters turned on at all times? If so, will it constantly be using 1000W of power even when the receiver is turned off? I've heard differing opinions but found no definitive answer on the interwebs.
Related, is it true that you should always have something plugged into the converter when it is turned on?
If so, will it constantly be using 1000W of power even when the receiver is turned off?
Almost without exception, the amount of electricity any device uses is equal to the amount of heat it kicks out. If it was using 1000W, you'd know.
posted by cillit bang at 4:21 AM on February 17, 2007
the only reason i can think of not to leave it plugged in is the fact that it's a big inductive load; this will reduce your power factor.
it's possible that your local electric company bills a premium based on power factor, but i doubt it. i don't think they can usually measure it on a per-house basis, so i think those premiums are only charged to larger, industrial users (idea being, i guess, that the various capacitive and inductive loads in a block of houses will tend to average out).
also, things tend to break when their state is changed and you turn them off and on, and tend to not break when they're just left running for a long long time. this is why light bulbs that are left on all the time can last for years, but ones that you flip on and off all the time won't.
so, honestly, you're probably better off just leaving it on all the time. with nothing plugged in your electrical meter won't tick over in any meaningful way.
posted by sergeant sandwich at 8:58 AM on February 17, 2007
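To put rough numbers on the power-factor point above, here is a minimal sketch; the no-load current and phase angle are illustrative assumptions, not measurements of any particular converter.

```python
import math

# Hypothetical no-load figures for a 1000 VA step-down transformer
# (illustrative assumptions, not measurements of any specific unit).
voltage = 220.0          # mains voltage, volts RMS
no_load_current = 0.25   # magnetizing current, amps RMS (assumed)
phase_angle_deg = 80.0   # current lags voltage; mostly inductive (assumed)

apparent_power = voltage * no_load_current                              # volt-amps
real_power = apparent_power * math.cos(math.radians(phase_angle_deg))   # watts
power_factor = real_power / apparent_power

print(f"Apparent power: {apparent_power:.1f} VA")
print(f"Real power:     {real_power:.1f} W")
print(f"Power factor:   {power_factor:.2f}")
# A residential meter bills real energy (kilowatt-hours of watts), which is
# why one idle transformer's poor power factor doesn't show up on the bill.
```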
As a google result said "Some mains transformers saturate at no load".
Saturation occurs when there is too much energy stored in an inductor's magnetic field (an unloaded transformer is basically an inductor). When saturation occurs, energy losses from the magnetic field go way up, and the magnetization current goes way up too. This means a lot of heat, and in extreme cases it could lead to a failure.
A transformer is different from an inductor because the secondary coil can steal energy from the magnetic field of the primary. If this energy sharing didn't happen, you would just have two inductors. When the energy sharing is taking place, the peak magnetic field can be substantially lower than it is under no load.
A consumer unit should be designed for safe operation at no load, because people normally won't know enough to be careful about this issue, but you never know..
Sorry, but I can't find a good ref for this phenomenon yet.
posted by Chuckles at 9:26 AM on February 17, 2007
Also.. Even without saturation, a 1000VA line-frequency transformer is going to use a lot of power under no load. While googling for the above I saw ~5% core losses, which don't decrease when the load is removed; that means ~50W, so it's a lot better to turn it off.
posted by Chuckles at 9:37 AM on February 17, 2007
When the energy sharing is taking place, the peak magnetic field can be substantially lower than it is under no load.
Argh.. So that is wrong.. The problem being my rusty knowledge of transformers, and my trying to come up with a less-technical explanation of how a transformer works..
Flux, which is the quantity that causes saturation, is strictly volt-seconds and windings. Under load, the flux density can be a little lower, because parasitic resistance can reduce the primary voltage, but it doesn't have anything to do with the power flowing from primary to secondary through the magnetic field. I believe there may be another mechanism that increases the possibility of saturation when there is no load, but I'm having trouble finding it..
posted by Chuckles at 10:38 AM on February 17, 2007
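To make the "volt-seconds and windings" point concrete, here is a minimal sketch using the standard transformer EMF equation; the turn count and core area are illustrative assumptions, not the specs of any real converter.

```python
# Peak core flux density from the standard transformer EMF equation:
#   V_rms = 4.44 * f * N * A * B_peak   =>   B_peak = V_rms / (4.44 * f * N * A)
# Load current appears nowhere: flux is set by volts per turn and frequency,
# which is the correction made in the comment above.
v_rms = 220.0        # primary voltage, volts RMS
frequency = 50.0     # mains frequency, Hz
primary_turns = 400  # number of primary turns (assumed)
core_area = 0.0025   # core cross-section, square metres (assumed, 50 mm x 50 mm)

b_peak = v_rms / (4.44 * frequency * primary_turns * core_area)
print(f"Peak flux density: {b_peak:.2f} T")
# Typical transformer steel saturates around 1.5-1.8 T, so designs keep
# B_peak comfortably below that at the rated voltage and frequency.
```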
An ideal transformer never consumes any power itself; it simply transfers power from its input to its output. If you leave an ideal transformer connected to a power source with nothing connected to its output, it will draw no power from the source.
Real transformers are not quite ideal, though some of them are very close. A real transformer connected to a power source, with nothing connected to its output, will draw a small amount of power on its own and might warm up a little. Similarly, a small percentage of the power transferred from input to output by any real transformer will be lost (converted to heat).
The difference between a transformer built to transfer 1000W safely to its output, and one built to transfer 750W, is that the 1000W unit will be engineered to dissipate more lost watts safely without getting too hot. Typically this is done by making everything bigger - the windings are made of thicker wire, and the core will be bigger.
I wouldn't expect to see much difference between a 750W unit and a 1000W unit when it comes to unloaded power draw. I'd expect either one to consume under ten watts when plugged in unloaded.
I can think of no reason not to operate a step-down transformer with no load, except for the few watts it's going to dissipate as useless heat even when you're not using the receiver. If electricity costs 14 cents per kilowatt-hour, a ten-watt loss running 24/7 is going to cost you about 14 cents every hundred hours, or about a dollar a month.
posted by flabdablet at 2:14 AM on February 17, 2007 [1 favorite]
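flabdablet's back-of-envelope cost can be reproduced as follows; a minimal sketch using the ten-watt idle draw and 14-cent electricity price from the comment above.

```python
# Standby cost of leaving the converter plugged in, per the numbers above.
idle_watts = 10           # assumed no-load draw from the comment above
price_per_kwh = 0.14      # electricity price from the comment above, dollars
hours_per_month = 24 * 30

kwh_per_month = idle_watts * hours_per_month / 1000   # 10 W for 720 h = 7.2 kWh
cost_per_month = kwh_per_month * price_per_kwh

print(f"Energy per month: {kwh_per_month:.1f} kWh")
print(f"Cost per month:   ${cost_per_month:.2f}")   # roughly a dollar a month
```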