Device charge time for chargers with different amps
June 12, 2019 6:37 AM   Subscribe

Given two chargers, one with roughly twice the amperage of the other, how will charge time be affected? Will the higher amp charger fully charge the device in half the time that the lower amp charger will take? Or does the charge time depend on something in addition to / other than amps?
posted by dancing leaves to Technology (8 answers total)
The maximum charging rate for the higher amp charger will be twice that of the lower one.

However, the device being charged will generally negotiate the charging rate with the charger, and will control the rate based on the capacity of the charger, the type of cable, the temperature of the battery, and so on.
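To a first approximation, the result of that negotiation is that the device draws the lesser of what the charger can supply and what the device will accept. A minimal sketch with made-up numbers (not the actual USB negotiation protocol, which is more involved):

```python
# Illustrative only: real negotiation (USB BC / USB PD) involves
# signaling on the data lines; the end result is roughly a minimum.
def effective_charge_current(charger_max_a, device_max_a):
    """The device draws no more than it wants, and no more
    than the charger advertises it can supply."""
    return min(charger_max_a, device_max_a)

# A 2.4 A charger paired with a phone that tops out at 1.5 A:
print(effective_charge_current(2.4, 1.5))  # 1.5
# Doubling the charger to 4.8 A changes nothing for this phone:
print(effective_charge_current(4.8, 1.5))  # 1.5
```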
posted by pipeski at 6:48 AM on June 12 [2 favorites]

There is a graph of real-life charging times for various adapter wattages about halfway down this page:

It looks like it is pretty close to twice as fast for a twice-as-powerful charger, yes.
posted by richb at 6:57 AM on June 12

Assuming the battery can take double the amps (batteries have a maximum charge rate), it's still not going to be exactly half: the beginning, and particularly the end (the last 10-20% or so), charge at less than the maximum rate to keep the battery healthy, although this varies by device. You may also run into thermal throttling that reduces the charge rate during the middle to keep things at a reasonable temperature.

That's to say it's complicated, but you could guesstimate it at somewhere around half the time.
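A toy two-phase model makes the guesstimate concrete. All numbers here are hypothetical (real charge curves vary by device); the key assumption is that the slow final phase is limited by the battery, not the charger, so both chargers crawl at the same rate near the top:

```python
# Toy model: full charger current up to a taper point, then a
# battery-limited current for the remainder. Figures are made up.
def charge_time_hours(capacity_ah, charger_a, taper_start=0.8, taper_cap_a=0.5):
    """Estimate hours to full: constant current until taper_start of
    capacity, then capped at taper_cap_a regardless of the charger."""
    fast = capacity_ah * taper_start / charger_a
    slow = capacity_ah * (1 - taper_start) / min(charger_a, taper_cap_a)
    return fast + slow

# Hypothetical 3 Ah battery:
t1 = charge_time_hours(3.0, 1.0)   # 1 A charger
t2 = charge_time_hours(3.0, 2.0)   # 2 A charger
print(round(t1, 2), round(t2, 2), round(t1 / t2, 2))
```

In this model the 2 A charger is about 1.5x faster, not 2x, because both chargers spend the same time on the tapered final stretch.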

There are various devices you can insert inline with your charger to report the actual current, so you can see how things are really charging.
posted by TheAdamist at 7:50 AM on June 12

If double the amperage is actually delivered to the device then yes, roughly speaking it would charge in half the time.

But that's a big caveat. Different devices negotiate differently with different chargers. The original USB specification set a maximum output current of 500mA, for example, and manufacturers have been playing games with that for a very long time. The catch is that a phone presenting as more than a 500mA load can cause a USB port to assume it's been short-circuited and either fail or shut down. So there has to be some kind of negotiation -- somehow the device learns that the charger is cool with heavier loads, and then the device adjusts to present a larger load.

On old iPods the docks could present analog voltages on the unused data lines to tell the device how much current they could deliver. These days I believe it's more of an active negotiation with a microcontroller in the charger. However I strongly doubt a generic $5 wall wart from Amazon has any brains in it at all, so a device would stick to 500mA regardless of the rating of the charger.
posted by range at 8:08 AM on June 12

I have an app called Ampere from Braintrapp on my Android phone. I used it to test several chargers and cables. Got rid of the slow ones.
posted by theora55 at 9:44 AM on June 12

given two chargers, the higher-amp one (or higher-watt, which is just amps x volts) will charge faster, up to the maximum amperage the device is rated to draw. beyond that, the extra amperage will not be used.

caveats: this holds for the output amps from the charger. the input amps are determined by the output amps and the charger's efficiency; some power is lost on conversion from AC to DC, and from one voltage level to another, with the waste turning into heat. generally, more efficient chargers cost more.
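As a back-of-envelope on those two points (all figures illustrative, and the 85% efficiency is an assumption, not a spec):

```python
# watts = volts x amps on the output side; the wall-side draw is
# inflated by AC-to-DC conversion losses, which turn into heat.
def output_watts(volts, amps):
    return volts * amps

def wall_input_watts(output_w, efficiency=0.85):
    """Power drawn from the outlet for a given DC output."""
    return output_w / efficiency

w = output_watts(5.0, 2.0)          # a 5 V / 2 A charger: 10 W out
print(w, round(wall_input_watts(w), 2))
```

So a nominal 10 W charger at 85% efficiency pulls roughly 11.8 W from the wall.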
posted by zippy at 11:39 AM on June 12

also, the rate of charging (and current draw) may vary with the level of charge, decreasing as the battery becomes fuller, so it may not be true that a 2x-amp charger will charge twice as fast: for the last 20% to 50% of the battery's capacity, the rate of charge may be the same as with the slower charger.
posted by zippy at 11:42 AM on June 12
