Voltage drop, amperage increase?
July 17, 2013 1:05 AM   Subscribe

We had an electrical incident where I work a couple of nights ago, and I'm trying to wrap my brain around it. Our house engineers have been running systems at around 150 amps on legs with 200 amp fuses. The power company reported a power dip, shortly after which we blew two of the 200 amp fuses. What happened? Did the drop in available voltage make our amperage demand shoot up?

Also, if you could give me a title or two or refer me somewhere somewhat basic online to read more, I'd be ever so grateful.
posted by nevercalm to Technology (9 answers total) 1 user marked this as a favorite
 
One possibility that springs to mind: if the impedance of your system includes a large reactive component, then when the input voltage drops, the current doesn't just step down; it oscillates. If that happened, at least one of the peaks of that oscillation could have exceeded the 200A limit.
posted by a snickering nuthatch at 1:13 AM on July 17, 2013


Response by poster: If it helps, most of the load on the system consisted of high-end, high-demand studio gear...monitors, servers, comms, audio, cameras and other assorted electronics. Little to none of it was anything like appliances, lighting or climate equipment.
posted by nevercalm at 1:27 AM on July 17, 2013


If you have switch-mode power supplies (e.g. practically all IT equipment these days), then when the voltage drops they will pull more current to keep their output up. If your voltage sags a bit because demand is too high, they all pull more current and you're boned.

Motors (or compressors or whatever) also pull more current when the voltage drops, as they still try to do the same amount of work.

Good explanations here
posted by samj at 1:35 AM on July 17, 2013 [5 favorites]


Pretty much any computer, amplifier, or other electronic device with a modern power supply will draw more current when the line voltage drops. This is because the power needed by the device is constant (or at least not related to line conditions), and power = voltage x current. So if voltage goes down, current must go up to compensate.

Since line voltage varies by country, modern power supplies are designed to handle a very large range. Most devices will be happy with anything from 100-250V in, and will scale their current consumption inversely.
posted by ryanrs at 2:02 AM on July 17, 2013 [4 favorites]
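To put rough numbers on that inverse scaling, here is a minimal sketch, assuming a 120 V nominal line and the 150 A steady draw mentioned in the question (the 18 kW figure follows from those assumptions; none of the numbers below come from the answers):

```python
# Constant-power load: current scales inversely with voltage (I = P / V).
NOMINAL_VOLTS = 120.0                          # assumed nominal line voltage
NOMINAL_AMPS = 150.0                           # steady draw described in the question
POWER_WATTS = NOMINAL_VOLTS * NOMINAL_AMPS     # 18 kW, held constant by the loads

for sag_pct in (0, 5, 10, 15, 20, 25):
    volts = NOMINAL_VOLTS * (1 - sag_pct / 100)
    amps = POWER_WATTS / volts                 # I = P / V
    print(f"{sag_pct:2d}% sag: {volts:5.1f} V -> {amps:5.1f} A")
```

On those assumed numbers, a leg sitting at 150 A reaches the 200 A fuse rating at roughly a 25% sag, and momentary transients can peak higher still.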


A very simple explanation is that amps x volts = watts. Most devices that are more complicated than a lightbulb or a toaster are actually trying to draw a certain number of watts to do the work that they need to do. So if the voltage drops, they need to draw more amps.

Look at newer power supplies: they are built to take any voltage in the world, 100V to 240V. They will draw from the source whatever they need to output what they are required to output.
posted by gjc at 3:04 AM on July 17, 2013


As above, plus: you want more headroom on your breakers and feeds if you're blowing fuses on a dip. Suggest you consult your electrical professional.
posted by seanmpuckett at 5:04 AM on July 17, 2013
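To illustrate that headroom point, here is a minimal sketch of a margin check, assuming a constant-power load as in the answers above; the 20% sag and the 80% continuous-load figure are assumptions (the latter is a common rule of thumb), not values from the thread:

```python
# Rough headroom check (illustrative only, not a sizing guide): a constant-power
# load at STEADY_AMPS needs extra fuse margin to ride through a voltage sag.
FUSE_AMPS = 200.0      # existing fuse rating
STEADY_AMPS = 150.0    # steady draw reported in the question
SAG = 0.20             # assume the leg must ride through a 20% sag

current_during_sag = STEADY_AMPS / (1 - SAG)   # constant power => I = P / V
print(f"Current during a {SAG:.0%} sag: {current_during_sag:.1f} A "
      f"(fuse margin {FUSE_AMPS - current_during_sag:+.1f} A)")

# Common rule of thumb: keep continuous load at or below ~80% of the rating.
print(f"80% of the {FUSE_AMPS:.0f} A fuse is {0.8 * FUSE_AMPS:.0f} A; "
      f"the legs were already running {STEADY_AMPS:.0f} A continuously.")
```

On those assumptions, 150 A continuous on a 200 A fuse leaves only a thin margin once a dip hits.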


Response by poster: Thanks all.

Yeah, they added another 200 amp leg the next day. Luckily we're well funded; they were just trying to save money, as usual.
posted by nevercalm at 6:35 AM on July 17, 2013


What was your expected current draw (in normal conditions)? 200A should power a lot of studio equipment, especially if none of it was lighting.
posted by schmod at 9:14 AM on July 17, 2013


Response by poster: It's my understanding that they were running around 150 amps, and the power dip was just enough to blow them.

The two 200 amp legs were powering most of the tech gear, which is a lot of stuff...the operating equipment, servers, graphics, edit suites, post, etc., etc.
posted by nevercalm at 2:26 PM on July 17, 2013

