Put me in charge!
November 16, 2012 5:46 PM

Explain to me the technology of charging devices. Is it mainly about voltage? About current? How close to the "required" values must you be?

Sometimes it takes a long time to charge a device. Sometimes it takes less time for the same device. What is it about charging via USB that makes it slower (if, in fact, it is) than charging it with an adapter that plugs into house current? Why can things be overcharged? And how can some things never overcharge (or so the manual says)? What did my phone detect when it displayed a message that said "this cable is not recommended for charging this device" when I attempted to charge it with a cable that looked to me just like all the others?

Lastly, my cordless landline phone sits in a charging cradle labeled 9 Volts, 150 mA. I assume some of that is to detect whether it is in the cradle or not and the rest is to charge the battery. How far from these values can I stray (since the original no longer works, thanks to the recent flood) without ruining the phone?
posted by Obscure Reference to Science & Nature (8 answers total) 12 users marked this as a favorite
 
Best answer: Wow, this is kind of broad.

I'll answer a couple of specifics, your last question first. Your phone requires a charger of 9 V, with the capacity to deliver at least 150 mA. The charger could be capable of delivering 300 mA, 1 A, 600 A or whatever, and it will still work, with no issues. (For the purposes of this discussion at least, I'm ignoring a couple of reasons you wouldn't use a massively over-specced charger.) On a cheap cordless phone you probably can't stray too far from that 9 V rating, however. I'd bet on it working at 8 V or 10 V, but much higher than that and it may get damaged; much lower and it may simply not work (or get damaged, if it's very poorly designed).

So, with ratings like these, you must ensure the charger can deliver at least the rated current and has a voltage that matches the device's rated voltage. It is the device that determines how much current is actually drawn, not the charger. All of this assumes a charger with a regulated output voltage, which is by far the most common thing you will see these days. A charger with an unregulated output voltage needs to match the device exactly, or nearly exactly.
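
To put that rule of thumb into a few lines of illustrative Python (the function name and the 10% voltage tolerance are my own inventions, not any standard):

def charger_is_ok(device_volts, device_milliamps,
                  charger_volts, charger_milliamps,
                  volt_tolerance=0.10):
    # Voltage must match the device's rating (within a small, made-up tolerance);
    # the charger's current rating only needs to be at least what the device wants.
    voltage_ok = abs(charger_volts - device_volts) <= device_volts * volt_tolerance
    current_ok = charger_milliamps >= device_milliamps
    return voltage_ok and current_ok

# The cordless phone cradle from the question: rated 9 V, 150 mA.
print(charger_is_ok(9, 150, 9, 150))    # True  - exact match
print(charger_is_ok(9, 150, 9, 600))    # True  - extra current capacity is fine
print(charger_is_ok(9, 150, 12, 1000))  # False - voltage too far off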

I am guessing that your phone detected a charging voltage that was outside spec. It has nothing to do with the cable, even if the phone manufacturer chose to put it that way.

I don't have time at the moment to give the full background physics to all of this, but if you google "ohms law" and read as much as you can you will understand this stuff pretty well. All of the statements above really depend on Ohm's law.
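
If numbers help more than prose, here is Ohm's law (V = I x R) as a toy calculation; the 60-ohm load resistance is made up purely for illustration:

voltage = 9.0        # volts, from the charger
resistance = 60.0    # ohms - a made-up, purely resistive load

current = voltage / resistance           # Ohm's law: I = V / R
print(f"{current * 1000:.0f} mA")        # 150 mA

# The same load at a higher voltage draws proportionally more current,
# which is one reason a too-high charger voltage can damage a device.
print(f"{(12.0 / resistance) * 1000:.0f} mA")   # 200 mA
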
posted by deadwax at 6:06 PM on November 16, 2012


All USB devices run at 5.00 ± 0.25 volts. The question, then, for charging purposes, is how much current you get. A big part of the problem is that the way devices use USB is often at odds with the actual charging standard. Many "USB" devices, like coffee warmers, fans, and lights, have plugs that are USB-shaped but draw current willy-nilly and are therefore not, technically, USB-compliant.

Originally devices could only draw 100 milliamps (0.1 A) without contacting the host, but recently that has been increased to 1.5 A. 5 V x 1.5 A = 7.5 watts, though some chargers, like the one for the newest (4th-generation) iPad, can now produce up to 12 W. Unfortunately, other manufacturers (like Motorola) use different, incompatible resistors to signal their non-standard charging currents. So an iPad and a Droid RAZR will both complain about the other's charging cable!
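
The arithmetic behind those numbers is just power = volts x amps; a quick sketch (the 12 W figure is the charger rating mentioned above, everything else is simple arithmetic):

USB_VOLTS = 5.0

def watts(volts, amps):
    return volts * amps

print(watts(USB_VOLTS, 0.1))   # 0.5 W  - the old 100 mA default draw
print(watts(USB_VOLTS, 1.5))   # 7.5 W  - a 1.5 A dedicated charging port
print(12 / USB_VOLTS)          # 2.4    - amps a 12 W charger must supply at ~5 V
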
posted by wnissen at 8:36 PM on November 16, 2012


To keep it simple: try to match the voltage fairly closely, but you can usually use a supply with a higher amp (or milliamp) rating without risk. It is almost always necessary to match the polarity (+ goes to + and - to -).

To complicate it a little, you can sometimes use a supply with a lower current rating (current is the quantity that amps measure) and the device will still charge, just more slowly, but not always. You can't damage something by trying a lower-current power supply, so it is worth a shot.
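
A back-of-the-envelope way to see why a lower-current supply charges more slowly (the battery capacity and the 80% efficiency figure are made-up illustrations, and real charging tapers off near the end):

def hours_to_charge(capacity_mah, charge_ma, efficiency=0.8):
    # Very rough estimate: hours ~= capacity / (current * efficiency).
    return capacity_mah / (charge_ma * efficiency)

battery_mah = 1500  # an assumed phone-sized battery
print(f"{hours_to_charge(battery_mah, 1000):.1f} h at 1000 mA")  # ~1.9 h
print(f"{hours_to_charge(battery_mah, 500):.1f} h at 500 mA")    # ~3.8 h
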
posted by bystander at 4:41 AM on November 17, 2012


All the above is wonderfully correct. In case you need it here's a couple of pieces of basic data:

Voltage = how "hard" the electrons push to get through the wire. If your device wants 9 V and you give it 6 V, the electrons can't "push" hard enough to get in and do work. If your device wants 9 V and you give it, say, 20 V, the electrons push so hard that they can break the things in your device that they push through.

Amperage (milliamps in these situations) = how many electrons move through the wire in a given unit of time. If your device wants 1500 mA and you give it (at the right voltage) 750 mA, depending on its design it will either charge slowly or not at all, but it is unlikely to get hurt. If you offer it 3000 mA and, again, if it's well designed, it will only use 1500 mA of the available current.

As stated by deadwax above, if it wants 1500 mA and you give it, say, 2 amps, you can hurt it because then it simply can't deal with the amount of extra current you are offering it.

Watts = volts x amps. It's a measure of how much work can be done per unit of time. High voltage and low amps can deliver the same power as low voltage and high amps if the ratio is right. The more watts, the more work done per unit of time. Think of light bulbs: a 100-watt bulb does more work (produces more light) than a 40-watt bulb. Same with chargers.
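
A quick illustration of that trade-off (the voltage/current pairs are arbitrary; only the 100 W bulb is a real-world reference point):

for volts, amps in [(120, 0.83), (12, 8.3), (5, 20)]:
    # Different voltage/current mixes can deliver the same power.
    print(f"{volts} V x {amps} A = {volts * amps:.0f} W")
# 120 V x 0.83 A = 100 W   (roughly a 100 W incandescent bulb)
# 12 V x 8.3 A = 100 W
# 5 V x 20 A = 100 W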

Hope that's useful
posted by BrooksCooper at 7:36 AM on November 17, 2012


Response by poster: So, let me see if I understand what you're all saying.

Charging is a matter of voltage. Amps, above the minimum, may determine how quickly things charge, but even that could be too much at some point. (I'm a little confused because BrooksCooper says 2A would be too much but deadwax thinks that's OK.) "Not recommended" probably was a voltage issue. I already knew the mathematics of this (Ohm's law, power as volts times amps) but only for answering questions on an exam. I didn't know they applied to real life.

So, why will some rechargeable devices overcharge if you leave them plugged in too long while others won't? Also, is "too long" a matter of minutes, or of days?

And will too many amps make my phone appear to be in use when it's not, or vice versa? (I can't seem to find a 9 volt adapter below 500 mA)
posted by Obscure Reference at 9:32 AM on November 17, 2012


Best answer: All of the above is true, but as smartphones and other consumer devices get more and more complex, there are some other factors that come into play on the charger side. These more-stringent requirements are (some of) the reason for charger authentication, which causes messages like the "This x is not recommended" one that you saw.

It's true that devices are generally happy if they have the right DC voltage, but sometimes it depends on what you mean by "the right DC voltage".

Ideally, the voltage from the charger -- between the charger's positive terminal and its negative terminal -- will be pure DC: a flat, constant value with no change over time.

But, of course, the voltage between the two prongs of your wall outlet is AC, a sine wave that's always changing with time. Furthermore, the average value of that voltage is zero (it spends equal time being positive and being negative), so even if we filter it, it's not very useful to us.

Now let's take the first step through a typical "wall-wart" power supply and rectify the AC voltage. If we do something called "full-wave" rectification, we take the negative peaks of the AC waveform and flip them around so they're positive, leaving a train of positive humps. Notice that although we're not anywhere near the ideal DC from before, the signal now has a positive average value: if you squint your eyes and step back, it looks kind of like DC. For some very basic devices, this rectified waveform might be sufficient.

Finally, let's do a little bit of filtering on that rectified DC. If we put a capacitor across the output to act as a sort of charge reservoir, the waveform gets smoothed out: it looks even more like DC, and it doesn't dive down nearly as much between peaks. We can measure how much the signal sags between peaks, and we call that the ripple. More filtering reduces the ripple that's present in the DC output, but more filtering costs money.
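
If you want to play with this, here is a crude, standard-library-only Python simulation of the rectify-then-filter idea; the 12 V peak, the filter time constant, and the ideal "charges instantly" capacitor shortcut are all just illustrative assumptions:

import math

freq_hz = 60.0       # mains frequency
peak_v = 12.0        # made-up transformer output peak
rc_seconds = 0.05    # made-up filter time constant
dt = 1e-4            # simulation time step

filtered = 0.0
samples = []
for step in range(int(0.2 / dt)):               # simulate 200 ms
    t = step * dt
    rectified = abs(peak_v * math.sin(2 * math.pi * freq_hz * t))  # full-wave rectified
    if rectified > filtered:
        filtered = rectified                      # crude capacitor: charges instantly
    else:
        filtered -= filtered * (dt / rc_seconds)  # ...and discharges through the load
    if t > 0.1:                                   # skip the start-up transient
        samples.append(filtered)

ripple = max(samples) - min(samples)
print(f"ripple is about {ripple:.1f} V on a {peak_v:.0f} V peak")
# A bigger rc_seconds (more filtering) shrinks the ripple -- and costs more.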

That's a very simplistic model of a power supply, and there are other components in most wall-warts. And the power supply for your cell phone is most likely a switch-mode power supply, which creates a smaller DC voltage by "chopping up" a high voltage, turning a switch on and off very quickly until the output voltage (after filtering) looks a lot like the desired DC voltage. Unfortunately, this switching generates other noise in the output signal, and this noise won't be at the 60 Hz or 120 Hz of the wall-wart's ripple -- it'll be up in the kHz range. We call distortion that happens at a certain frequency harmonic distortion, and since that switching noise happens at a higher frequency, it's higher-order harmonic distortion.

For a relatively simple device like your cordless home phone, almost any charger rated at the correct voltage and current will do. There's probably not much in the way of complex circuitry between the battery and the charger, and the phone won't really stop working if it gets a sloppy input voltage. This is a good thing. But for a more complex device like your iPhone, a poor-quality input voltage can really cause things to mess up. A common effect of this poor-quality power is capacitive touchscreens breaking [1]. This is the reason for charger authentication schemes: when users plug their phones into a USB port, they aren't going to be considering the power quality at that particular port -- to them, a port is a port. And when the screen on their device stops working, they're not going to blame the charger, they're going to blame the device. Apple and other OEMs try to "authenticate" what they're plugged into to ensure that high-quality power will be provided.

I've simplified a lot of things, and I'm sure I've left out something important, but those are some of the important factors that make charging complicated nowadays. If you want more information on power supplies and contemporary cell-phone charging, you should check out Ken Shirriff's blog:
Apple iPhone charger teardown: quality in a tiny expensive package
Tiny, cheap, and dangerous: Inside a (fake) iPhone charger
A dozen USB chargers in the lab: Apple is very good, but not quite the best

[1] Noise Wars: Projected Capacitance Strikes Back, via the seriously awesome Ken Shirriff posts mentioned above
posted by aaronbeekay at 9:49 AM on November 17, 2012


Best answer: Charging is a matter of voltage. Amps, above the minimum, may determine how quickly things charge, but even that could be too much at some point.

To use the electricity-as-water analogy: Think of current as the volume of water flowing through a pipe. Your device is a faucet.

If you have a charger rated at 2 A, that means it's able to supply 2 A: it's a big pipe (depending on what you're charging). But if you charge a device that only "wants" 500 mA -- that is, you only open the faucet a little bit -- that's okay too; you're just not using everything the charger can do.

It's true that at constant voltage, increasing the current will increase the power delivered -- that's P = VI, as you mention. But the device is the one that decides how much current it draws, not the source. In "dumb" devices, this is set by physical parameters, like how much resistance the battery inside has. In smarter devices, like your phone, the device is able to request a certain amount of current, and it can shut off the current when it's done charging, or if the charger is the wrong type, or whatever else.
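
A tiny sketch of that "the device decides" point (the names and numbers are mine, purely for illustration):

def delivered_ma(device_draw_ma, charger_limit_ma):
    # The current that actually flows is the smaller of what the device
    # tries to pull and what the source can supply.
    return min(device_draw_ma, charger_limit_ma)

print(delivered_ma(500, 2000))   # 500 - big charger, small device: fine
print(delivered_ma(500, 500))    # 500 - exact match: fine
print(delivered_ma(1500, 500))   # 500 - undersized charger: slow charge (or none),
                                 #       and a cheap supply may sag or overheat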

This is why some devices "stop" charging, and some devices just hang out forever. Whether devices are harmed by overcharging depends on what kind of battery is inside them: for example, lead-acid batteries (cars, some UPSs, etc) can tolerate a lot, including being "trickle charged" constantly when they're full, but lithium batteries (laptops, cell phones, other new-style electronics) have very specific requirements for maximum voltage and maximum charge that must be obeyed for safety. These things are questions of chemistry, but the point is that different devices will behave differently because they have different kinds of batteries. In general, electronics are designed to charge safely, protect the batteries, and be as cheap as possible, and hopefully in that order.
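
To make that chemistry difference concrete, here is a toy decision rule; the 4.2 V lithium full-charge voltage and 13.8 V lead-acid float voltage are typical textbook figures, and the current values are made up:

def charge_current_amps(battery_volts, full_volts, chemistry):
    if battery_volts < full_volts:
        return 1.0          # not full yet: charge at the normal rate
    if chemistry == "lithium":
        return 0.0          # full lithium cell: charging must stop for safety
    return 0.05             # full lead-acid battery: a small float/trickle is fine

print(charge_current_amps(3.9, 4.2, "lithium"))      # 1.0  - still charging
print(charge_current_amps(4.2, 4.2, "lithium"))      # 0.0  - done, shut off
print(charge_current_amps(13.8, 13.8, "lead-acid"))  # 0.05 - can float indefinitely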

(I'm a little confused because BrooksCooper says 2A would be too much but deadwax thinks that's OK.)

That really depends on your device. Most consumer digital electronics run on 500 mA or less, so 2 A is on the high side of things (but only slightly). But in order to charge batteries as quickly as possible, some manufacturers are now building chargers that supply 1 A or more -- I believe the iPad charger supplies 2 A.

In general, you can almost always use any power supply rated at or above the current rating of your device, with the same voltage rating. (Assuming the device isn't too picky, like the new smartphones I'm talking about above.)
posted by aaronbeekay at 10:13 AM on November 17, 2012


So, why will some rechargeable devices overcharge if you leave them plugged in too long while others won't?

It depends on the built-in smartness of the device's charging circuitry. A simple device will just connect the charging power directly to the battery, with nothing more complex than a fuse in between, and perhaps have some sort of charging/fully-charged indication. Leaving a device like this plugged in too long can result in overcharging.

More complex devices will have an internal charge monitoring circuit that will shut off the charging to prevent overcharging - it's still a good idea not to leave this sort of thing plugged in indefinitely though.
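
As a rough sketch of that difference (the voltages and the 0.1 V-per-hour step are invented just to show the shape of the behavior):

def volts_after(hours_plugged_in, has_monitor, cutoff_volts=4.2):
    volts = 3.6                     # partly discharged cell
    for _ in range(hours_plugged_in):
        if has_monitor and volts >= cutoff_volts:
            break                   # monitoring circuit shuts charging off
        volts += 0.1                # crude "charging raises the voltage" step
    return round(volts, 1)

print(volts_after(24, has_monitor=True))    # 4.2 - stops at the cutoff
print(volts_after(24, has_monitor=False))   # 6.0 - keeps climbing: overcharge territory
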
posted by HiroProtagonist at 8:47 PM on November 19, 2012

