It depends on the battery and the charger.
If you have a 1.5 V, 2000 mAh battery and charge it at a constant current of 200 mA, it will take 10 hours to charge:
mAh = mA × h, so 2000 = 200 × 10.
Now, this process is not exactly 100% efficient; some of the charge is lost as heat. At C/10 (a charge current equal to the capacity divided by 10, here 200 mA) you will probably get about 90% efficiency, so take the number above and divide by 0.90:
10 / 0.90 ≈ 11.1 hours
If you increase the charge current to, say, 400 mA, the time will not drop exactly in half, because efficiency falls as you produce more heat. At C/5 you are probably only about 80% efficient, so rather than 5 hours it may take about 6.3 (5 / 0.80 = 6.25).
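The constant-current arithmetic above can be sketched in a few lines. The capacity and currents are the answer's own examples; the 90% and 80% efficiency figures are the rough estimates given, not measured values.

```python
def charge_time_hours(capacity_mah, current_ma, efficiency):
    """Hours to charge at a constant current, allowing for charge lost as heat."""
    return capacity_mah / current_ma / efficiency

# C/10: 2000 mAh at 200 mA, roughly 90% efficient
print(round(charge_time_hours(2000, 200, 0.90), 2))  # about 11.11 hours

# C/5: 2000 mAh at 400 mA, roughly 80% efficient
print(charge_time_hours(2000, 400, 0.80))            # 6.25 hours
```

Note that doubling the current here cuts the ideal time from 10 hours to 5, but the lower efficiency pushes the real time back up toward 6.3.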
If you are not charging at a constant current, you will need to chart the current over time. Typically a battery charges at a high current initially, then tapers off. Take a reading every minute, add up the readings over 60 minutes, and divide the sum by 60 to get mAh for that hour (each 1-minute reading covers 1/60 of an hour). If you take readings only every 2 minutes, do the same thing but divide by 30. Yes, this can get tedious, but the numbers become more stable as time goes by and can be estimated.
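The averaging method above is just numerical integration: each reading stands for its sample interval, and minutes are converted to hours. A minimal sketch, where the reading values are made-up illustrative numbers, not real measurements:

```python
def charge_mah(readings_ma, interval_minutes):
    """Approximate charge delivered (mAh) from current readings taken at a fixed interval."""
    return sum(readings_ma) * interval_minutes / 60.0

# One reading per minute for an hour: sum the readings and divide by 60.
per_minute = [400] * 30 + [200] * 30      # current tapering off mid-hour
print(charge_mah(per_minute, 1))          # 300.0 mAh for that hour

# Readings every 2 minutes: same idea, which works out to dividing by 30.
every_two = [400] * 15 + [200] * 15
print(charge_mah(every_two, 2))           # also 300.0 mAh
```

Either sampling rate gives the same answer here because the current changed slowly; faster-changing currents reward more frequent readings.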
2007-02-26 14:24:53 · answer #1 · answered by megaris