Let us assume that the source voltage is 120V
Using the following equations:
P = I x V
V = R x I
P - Power
I - Current
V - Voltage
R - Resistance
To start, we assume that ONLY the 200 watt bulb is used,
the current will then be I = P/V = 200/120 = 1.67 A
Next, we can calculate the resistance of the 200 watt bulb,
R = V/I = 120/1.67 = 71.8 ohms
Similarly,
When ONLY the 100 watt bulb is used,
The current is 100/120 = 0.83A
Therefore the 100 watt bulb resistance = 120/0.83 = 144.5 ohms
NOW, if BOTH the bulbs are in series, we have a combined resistance of 71.8 + 144.5 = 216.3 ohms
So, the current is 120/216.3 = 0.55A
In the case of the 200 watt bulb,
- Voltage = R x I = 71.8 x 0.55 = 39.49 V
- Power = I x V = 0.55 x 39.49 = 21.7 Watts
In the case of the 100 watt bulb,
- Voltage = 144.5 x 0.55 = 79.48 V
- Power = 0.55 x 79.48 = 43.7 Watts
As the calculations above show, the 100 watt bulb dissipates more power (43.7 W) than the 200 watt bulb (21.7 W), so the 100 watt bulb will be brighter.
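For anyone who wants to check the arithmetic, here is a minimal Python sketch of the same calculation. It assumes a 120 V supply and treats each filament as a fixed resistor (in reality filament resistance rises with temperature, so the numbers are only approximate):

V_SUPPLY = 120.0                                     # assumed supply voltage, in volts
P_RATED = {"200 W bulb": 200.0, "100 W bulb": 100.0}

# Resistance of each bulb from its rating: R = V^2 / P (rearranged from P = V^2 / R)
resistance = {name: V_SUPPLY**2 / p for name, p in P_RATED.items()}

# In series the bulbs carry the same current: I = V / (R1 + R2)
current = V_SUPPLY / sum(resistance.values())

for name, r in resistance.items():
    v_drop = current * r                             # V = I x R
    power = current * v_drop                         # P = I x V
    print(f"{name}: R = {r:.1f} ohms, V = {v_drop:.1f} V, P = {power:.1f} W")

This prints about 22 W for the 200 W bulb and about 44 W for the 100 W bulb, matching the figures above apart from rounding.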
answer #1 · answered by ideaquest 7 · 2006-08-01 03:16:22
It depends on the resistance of the two bulbs. Assuming that both bulbs have the same resistance, the 100 watt bulb will glow brighter because it needs less energy to reach its maximum light intensity.
answer #2 · answered by ET 3 · 2006-08-01 09:46:48
A lower-wattage bulb has more resistance than a higher-wattage one, so across the same supply it passes less current and consumes less power:
P = VI = V^2/R.
However, when the two bulbs are connected in series, the larger resistance has the greater voltage drop across it, while the current through the two bulbs is the same:
P = VI,
so the smaller bulb consumes more power and therefore glows more brightly.
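To make that argument quantitative in general (a sketch, writing the rated powers as $P_1$ and $P_2$ at supply voltage $V$ and treating the resistances as constant):

$$R_i = \frac{V^2}{P_i}, \qquad I = \frac{V}{R_1 + R_2}, \qquad P_i^{\text{series}} = I^2 R_i \;\Rightarrow\; \frac{P_1^{\text{series}}}{P_2^{\text{series}}} = \frac{R_1}{R_2} = \frac{P_2}{P_1}.$$

So in series the powers come out in the inverse ratio of the ratings: the 100 W bulb dissipates twice the power of the 200 W bulb.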
answer #3 · answered by Duds331 5 · 2006-08-01 09:47:45
If the bulbs are in PARALLEL, the SAME POTENTIAL difference is applied across each.
For a given potential difference, power is inversely proportional to the resistance, since Power = V^2 / R.
The power of the 200 W bulb (A) is twice that of the 100 W bulb (B).
Hence, (the resistance of A / the resistance of B) = (1/2). {Inverse ratio}
If the bulbs are in SERIES, the SAME CURRENT passes through the two bulbs.
For a given current, power is directly proportional to the resistance, since Power = I^2 R.
Hence, (power of A / power of B) = (resistance of A / resistance of B) = (1/2) = 0.5.
The power of bulb A (200 W) is less than the power of bulb B (100 W) when they are connected in series.
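A quick numeric check of that ratio in Python (a sketch; the assumed 120 V supply cancels out, so the 0.5 comes out the same for any supply voltage):

V = 120.0                               # assumed supply voltage; it cancels in the ratio
R_A, R_B = V**2 / 200.0, V**2 / 100.0   # rated resistances of A (200 W) and B (100 W)
I = V / (R_A + R_B)                     # same series current through both bulbs
print((I**2 * R_A) / (I**2 * R_B))      # power of A / power of B -> 0.5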
answer #4 · answered by Pearlsawme 7 · 2006-08-01 23:41:10
You need to think in terms of which one needs the least amount of power to burn brightly.
It would indeed be the 100 watt light bulb. Even if the 200 watt bulb drew off most of the power, the 100 watt bulb would still get what it needed to burn at almost 100% of its rating.
Another way to think of it: wattage is tied to resistance, and the electrical supply takes the path of least resistance, so which bulb would that be?
answer #5 · answered by jjnsao 5 · 2006-08-01 09:54:06
The voltage divides across a series circuit: the current is the same through both bulbs, but the voltages across them are not. The 100 watt bulb, having the larger resistance, has the greater voltage drop.
Maybe that is the explanation your teacher had in mind.
answer #6 · answered by Scott M 3 · 2006-08-01 09:47:20
Your teacher is right: the 100W bulb will be brighter.
A 200 W bulb has a lower resistance (R = V^2/P = 220^2/200 = 242 ohms) than a 100 W bulb (484 ohms). The 220 V supply will be split roughly 73 V across the 200 W bulb and 147 V across the 100 W bulb.
Since both voltages are insufficient to light the bulbs fully, the one with the higher voltage across it will be the brighter.
Remember your Ohm's law!
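Those figures check out numerically; here is a short Python sketch under the same assumptions (220 V supply, constant resistances):

V = 220.0                                  # supply voltage assumed in this answer
R_200, R_100 = V**2 / 200.0, V**2 / 100.0  # 242 ohms and 484 ohms
I = V / (R_200 + R_100)                    # series current, about 0.30 A
print(I * R_200, I * R_100)                # about 73 V and 147 V, as stated above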
answer #7 · answered by just "JR" 7 · 2006-08-01 09:53:56