Wattage of the heater ÷ 110v = Amps Needed
(example: a 1250 watt heater ÷ 110 V = 11.36 amps), so one 1250 watt heater on a 15 amp circuit.
2007-10-27 13:45:19 · answer #1 · answered by Iam 2 · 0⤊ 1⤋
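The arithmetic in the answer above (current = power ÷ voltage) can be sketched in a few lines of Python; the `amps_needed` helper and the 110 V default are illustrative assumptions, not from any code standard.

```python
# Sketch of the amps-from-wattage arithmetic: I = P / V.
# Assumes a nominal 110 V circuit, as in the example above.

def amps_needed(watts, volts=110):
    """Current drawn by a resistive heater in amps."""
    return watts / volts

# The 1250 W example from the answer:
print(round(amps_needed(1250), 2))  # → 11.36
```

The same helper shows why two such heaters (22.7 amps) would trip a 15 amp breaker.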
It depends on the heaters and on the circuit. A 15 amp circuit can support a total of 1800 watts. A 20 amp circuit can handle 2400 watts. Look on the heaters for the rating. And don't forget, anything else already on the circuit must be figured in.
Forget that 80% rule; it does not apply to your general-use receptacle outlet circuits.
2007-10-28 00:16:53 · answer #2 · answered by John himself 6 · 1⤊ 1⤋
Take the total wattage and divide it by the voltage used; that will give you the amps needed to run the heaters. On a 20 amp circuit you can only use 16 amps, and on a 30 amp circuit you can only use 24 amps. You are only allowed to use 80% of the circuit breaker's rating.
2007-10-27 09:22:18 · answer #3 · answered by VADER 1 · 0⤊ 1⤋
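The 80% figures cited in the answer above can be checked with a short sketch; the `usable_amps` helper and the fixed 0.8 derate factor are assumptions for illustration.

```python
# Sketch of the 80% continuous-load rule: usable current
# is 80% of the breaker's rated amps.

def usable_amps(breaker_amps, derate=0.8):
    """Continuous-load limit as a fraction of the breaker rating."""
    return breaker_amps * derate

for breaker in (15, 20, 30):
    print(f"{breaker} A breaker -> {usable_amps(breaker):.0f} A usable")
# 20 A -> 16 A and 30 A -> 24 A, matching the answer.
```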
Check the AMP limit of the circuit and AMP draw of the heaters
2007-10-27 08:20:58 · answer #4 · answered by DIY Doc 7 · 2⤊ 0⤋
Check the amp limit - heater circuits usually run 20-30 amps, and most residential breakers are not more than 20 amps, so it's safe to run just one.
Get an electrician.
2007-10-27 11:20:44 · answer #5 · answered by kunjaldp 4 · 0⤊ 1⤋
There is an average of 4 house fires a season due to electric heaters in this area. Just one.
2007-10-27 10:08:46 · answer #6 · answered by triminman 5 · 0⤊ 0⤋
Not more than one.
2007-10-27 09:35:02 · answer #7 · answered by William B 7 · 0⤊ 1⤋
Not sure.
2007-10-27 08:19:25 · answer #8 · answered by racheal m 1 · 1⤊ 2⤋