
Can a 15-amp circuit handle 27 65-watt bulbs? I used 15 divided by (65 divided by 110) = about 27 bulbs of 65 watts each.

2007-01-22 14:37:06 · 10 answers · asked by Anonymous in Home & Garden Maintenance & Repairs

10 answers

I'm pretty sure that when a consumer-type bulb gives its rating, it is the wattage when powered at 120 V. So, after the start-up surge, each 65-watt bulb will draw a little less than 0.55 amps at 120 V. If you wanted to push your breaker and wiring to their design limits, you could power 27 bulbs. But if your line voltage went to 125 V (still within many power companies' specs), you would exceed 15 amps. Of course, tolerance in the breaker itself (maybe ±10%) would outweigh most variations due to line voltage.

The start-up surge for a bulb can be very high, but it drops to the steady-state current fairly quickly. Most circuit breakers can handle more than their steady-state rating for a short time. So, as one of the answerers said, turning on all the lights at once may trip the breaker, but it might also work. I'd be more concerned about running a circuit for a long period right up at its rating. My natural conservatism would agree with the answerer who said to stay below 75% of a circuit's rating.
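The arithmetic in this answer can be sketched in a few lines of Python. This is a rough model, not part of the original answer: it assumes the nameplate wattage is measured at 120 V and treats the filament as a fixed resistance, so power scales with the square of line voltage (real tungsten resistance rises with temperature, so this slightly overstates the voltage effect).

```python
def max_bulbs(bulb_watts, breaker_amps, line_volts, rated_volts=120):
    """How many bulbs fit on a breaker at a given line voltage.

    Rough model: the filament is a fixed resistance, so the bulb's
    actual power scales with (line_volts / rated_volts) ** 2.
    """
    power = bulb_watts * (line_volts / rated_volts) ** 2
    amps_per_bulb = power / line_volts
    return amps_per_bulb, int(breaker_amps // amps_per_bulb)

for volts in (120, 125):
    amps, n = max_bulbs(65, 15, volts)
    print(f"{volts} V: {amps:.3f} A per bulb, up to {n} bulbs")
```

At 120 V this gives the 0.54 A per bulb and 27-bulb limit quoted above; at 125 V the same 27 bulbs would push the circuit past 15 A, which is the answer's point.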

2007-01-23 07:07:20 · answer #1 · answered by Bryon W 2 · 0 0

From a math standpoint, using your values:

15A x 110V = 1650W

1650W / 65W = 25.38 bulbs

Using those values, 27 bulbs is a no go.

Additionally, there is an initial surge when a light is first turned on, because the cold filament has very low resistance (close to a short) until it heats up. If you were to wire 27 65 W bulbs and apply power to all of them at the same time, you would likely trip the breaker due to the large current surge.

Personally, I would recommend only using about 75% of the breaker capacity when calculating the number of bulbs to use per breaker.
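The math above, including the suggested 75% margin, works out like this (a quick sketch using the 110 V figure from the question; the 75% factor is this answer's rule of thumb, not a code requirement):

```python
volts, breaker_amps, bulb_watts = 110, 15, 65

capacity_watts = breaker_amps * volts               # 15 A x 110 V = 1650 W
full_load_bulbs = capacity_watts / bulb_watts       # absolute maximum
derated_bulbs = 0.75 * capacity_watts / bulb_watts  # at 75% of capacity

print(f"capacity: {capacity_watts} W")
print(f"full load: {int(full_load_bulbs)} bulbs")
print(f"at 75% of capacity: {int(derated_bulbs)} bulbs")
```

So the hard limit is 25 bulbs, and the conservative 75% figure brings it down to 19.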

2007-01-22 14:57:08 · answer #2 · answered by Anonymous · 0 0

Yes, with 45 watts to spare, based on 120 V:
15 A × 120 V = 1,800 watts
27 × 65-watt bulbs = 1,755 watts
I don't know where you came up with 110. If that was volts, you would only have 110 × 15 = 1,650 watts, and 1,650 / 65 = 25.38 bulbs.

2007-01-22 14:46:45 · answer #3 · answered by Anonymous · 0 1

I always used the guideline that 1 amp handles about 100 watts, so 15 amps equals 1,500 watts, or about 23 65-watt bulbs. By using the new spiral fluorescent bulbs, you can get 60 watts' worth of light for only 14 watts of power. They can be found for about $1.50 each.
Save 75% on your power bill.

2007-01-22 14:47:52 · answer #4 · answered by Anonymous · 1 0

Your computations seem sound to me. I did the following: 27 × 65, which gives the total wattage (1,755 W). Then I divided that by 117 volts RMS, which gives the max amps (exactly 15 A). Good show!

2007-01-22 14:47:43 · answer #5 · answered by futurebtmfdr 2 · 0 0

You should never exceed 80% of the circuit's ampacity, which in this case is 1,440 watts (80% of 15 A × 120 V). Some of the trunk slammers out there will tell you differently, but this is a good rule for any residential circuit. Code rules. Absolutely no more than 22 lamps on this circuit.
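The 80% figure quoted here works out as follows (assuming the 120 V nominal supply that yields the 1,440 watts mentioned above):

```python
breaker_amps, volts, bulb_watts = 15, 120, 65

# 80% continuous-load limit on a 15 A, 120 V branch circuit
continuous_load_watts = 0.80 * breaker_amps * volts   # 1440 W
max_lamps = int(continuous_load_watts // bulb_watts)  # whole bulbs only

print(f"{continuous_load_watts:.0f} W continuous, max {max_lamps} lamps")
```

1,440 W divided by 65 W per lamp is 22.15, so 22 lamps is the ceiling under this rule.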

2007-01-23 10:42:14 · answer #6 · answered by electric_bob58 1 · 1 0

I calculated 25.

110 V × 15 A = 1,650 W

1,650 W / 65 W = 25.38 bulbs

2007-01-22 14:49:01 · answer #7 · answered by pupunhao 1 · 0 0

By simple math, yes, you could. But by the NEC, no, you can't: you can't use more than 80% of the rated branch circuit, so the most you could put on it would be 24.

2007-01-23 11:15:07 · answer #8 · answered by Ray D 5 · 0 0

One good way to find out: get 30 light sockets, wire them up, and start screwing the bulbs in! When they all suddenly go out, you'll know you've reached the limit. That's verifying it the Mythbusters way!

2007-01-22 16:52:50 · answer #9 · answered by BuddyL 5 · 0 0

I say no. 65 × 27 = 1,755 watts,
divided by 110 volts = 15.95 amps.

2007-01-23 03:19:50 · answer #10 · answered by hdwasp59 2 · 0 0
