
Will a 220 V room heater produce the same heat on a 110 V supply? Tell me the reason.

2007-01-16 17:46:56 · 6 answers · asked by saheb 1 in Science & Mathematics Engineering

6 answers

The power dissipated in an ideal resistor is

P = V² / R

where V is the voltage across the resistor and R is the resistance.

Thus, if the room heater had a fixed resistance, reducing the voltage by half (to 110 V) would cut the power to one quarter (500 W).
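As a quick check of that claim, here is a minimal Python sketch, assuming an ideal, temperature-independent resistance and the 220 V, 2000 W rating discussed in this thread:

```python
# Ideal fixed-resistance heater: P = V^2 / R
RATED_VOLTAGE = 220.0   # volts
RATED_POWER = 2000.0    # watts

# Resistance implied by the nameplate rating: R = V^2 / P
resistance = RATED_VOLTAGE**2 / RATED_POWER   # 24.2 ohms

# Power when the same element is run from a 110 V supply
power_110 = 110.0**2 / resistance             # 500.0 watts

print(f"R = {resistance:.1f} ohm, P at 110 V = {power_110:.0f} W")
```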

However, the resistance of the room heater varies with temperature. Most electrical heating elements use Nichrome wire, which has a positive temperature coefficient. At 1000°C, the resistance of Nichrome is approximately 6% higher than the resistance at 20°C.

I don't know what the temperature of the heating element for your heater will be when it is plugged into 220V, but I can say that the temperature will be much lower when it is plugged into 110V because the power is much lower.

Because the temperature is much lower, the resistance of the Nichrome wire is also somewhat lower, and from the equation above a lower resistance raises the power slightly. A reasonable estimate (also known as a w.a.g.) is a 3% reduction in resistance. So plugging your 220 V, 2000 W room heater into a 110 V supply will reduce the heat output to around 515 watts.
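The temperature correction can be sketched the same way; note that the 3% figure is this answer's own rough guess, not a measured value:

```python
# Rough correction for the cooler (and therefore lower-resistance) element
R_HOT = 220.0**2 / 2000.0       # 24.2 ohm at the full 220 V operating temperature
R_COOLER = R_HOT * (1 - 0.03)   # assume ~3% lower resistance at the cooler 110 V temperature

power_adjusted = 110.0**2 / R_COOLER
print(f"Adjusted output at 110 V: {power_adjusted:.0f} W")   # about 515 W
```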

2007-01-16 19:15:07 · answer #1 · answered by Tech Dude 5 · 0 0

Amazing what kind of answers you get here.

At 220 V for a 2000 W load, the current is 2000/220 = 9.09 amps.

At 110 V, with the element's resistance unchanged, the current is cut in half to 4.55 amps.

Power is now V*I = 110 * 4.55 amps ≈ 500 W.

So, at the new current of 4.55 amps, the power is cut to 1/4 of the original value.
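The same numbers fall out of this current-based view; a small sketch, assuming the element resistance stays fixed:

```python
# Current-based view of the same resistive load
current_220 = 2000.0 / 220.0        # 9.09 A at 220 V (I = P / V)
resistance = 220.0 / current_220    # 24.2 ohm  (R = V / I)

current_110 = 110.0 / resistance    # 4.55 A: halving V halves I for a fixed R
power_110 = 110.0 * current_110     # 500 W: one quarter of the original 2000 W

print(f"I = {current_110:.2f} A, P = {power_110:.0f} W")
```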

Will it work? Yes, as long as it doesn't have a fan to push the air. The fan motor may have trouble with the low voltage and burn up. The heating elements will not burn out, but they probably won't produce enough heat to make you happy.

2007-01-17 01:57:04 · answer #2 · answered by bkc99xx 6 · 1 0

Most of the answers given are correct. What you have to keep in mind is that the limiting factor here is that you only have 110 V driving your heater.

People misconstrue devices and current. It's not which device you plug into a string of outlets, but how many devices you plug into that string. They all get about 115 V, and each draws an amount of current accordingly. When the total hits the 15 A limit, the circuit breaker will trip.
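To illustrate the breaker point, here is a small sketch; the device wattages are made-up examples, and a real breaker's trip behavior is more complicated than a simple threshold:

```python
# Hypothetical loads sharing one 115 V branch circuit protected by a 15 A breaker
BRANCH_VOLTAGE = 115.0   # volts
BREAKER_LIMIT = 15.0     # amps

device_watts = [1500.0, 300.0]                      # e.g. a space heater plus a small appliance
total_current = sum(device_watts) / BRANCH_VOLTAGE  # each device draws I = P / V

print(f"Total draw: {total_current:.1f} A")
if total_current > BREAKER_LIMIT:
    print("Over the 15 A limit, so the breaker trips")
```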

2007-01-17 16:57:17 · answer #3 · answered by Christina 6 · 0 0

With 220 V the power is 2000 W, which means it draws 2000/220 = 9.1 amps.
From R = V / I, the resistance is 220/9.1 = 24.2 ohms. When connected to a 110 V line, the current will be 110/24.2 = 4.54 amps, and the power will be I²R = 4.54 × 4.54 × 24.2 ≈ 500 watts. That means you will need one more blanket.
Bye.
Rohin
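For completeness, a short sketch confirming that the I²R route here gives the same 500 W as the V²/R route in the first answer:

```python
# Same load, computed via P = I^2 * R
resistance = 220.0 / (2000.0 / 220.0)    # 24.2 ohm from R = V / I
current_110 = 110.0 / resistance         # 4.545 A at 110 V
power_i2r = current_110**2 * resistance  # 500 W, matching V^2 / R

print(f"P = {power_i2r:.0f} W")
```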

2007-01-17 03:00:55 · answer #4 · answered by Rohinton B 1 · 1 0

Since V (voltage) = I (current) × R (resistance) and R is constant, the power drawn will be only 1000 watts. Since the energy spent is half, the heat produced is half.

2007-01-17 02:32:44 · answer #5 · answered by Sridhar MA 1 · 0 3

Current will be drawn and the device will burn out.

2007-01-17 01:52:42 · answer #6 · answered by blitzkrieg_hatf6 2 · 0 1
