According to the Joule effect, any current passing through a conductor creates heat. But it's also true that electric currents circulate better through a heated conductor.
So... that would trigger an even higher heat level in the conductor...
And... that would trigger a stronger current... and... well, you get the picture.
All this should happen very fast, so basically every wire should burn out in 10-20 minutes. My question is: what stops that from happening? Oh, and it would be *great* if you could point me to a page that also explains it.
Thank you in advance.

2007-05-03 20:46:14 · 4 answers · asked by anarki 1 in Science & Mathematics Physics

gp4rts, well, what about short circuits then?
Basically you're saying that a metallic conductor never melts down.

2007-05-04 02:08:33 · update #1

4 answers

In metallic conductors, resistance INCREASES with temperature, so as a wire heats up it draws less current at a given voltage and the heating is self-limiting; that is why you can create a heater by applying a constant voltage across a wire. Semiconductors, however, decrease in resistance with increasing temperature, and if you apply a constant voltage across them, their power dissipation will increase as they heat. That is why they are not used in heaters and incandescent lamps. Some semiconductors will exhibit a "thermal runaway" condition that will destroy them, sometimes in less than a second.
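
Here is a quick toy model in Python showing both behaviors under constant voltage (the voltage, cooling rate, heat capacity, and temperature coefficients are all made-up illustration values, not real material data):

```python
# Toy lumped thermal model of a resistor held at constant voltage.
# Heat in: Joule heating V^2 / R(T). Heat out: assumed proportional
# to (T - ambient). All constants are invented for illustration.

V = 10.0        # applied voltage, volts
R0 = 1.0        # resistance at ambient temperature, ohms
T_AMB = 20.0    # ambient temperature, deg C
K_COOL = 1.0    # assumed cooling coefficient, W per deg C
C_HEAT = 10.0   # assumed heat capacity, J per deg C
DT = 0.01       # simulation time step, seconds

def simulate(alpha, steps=50000):
    """Euler-step the model; alpha is the temperature coefficient of
    resistance (positive = metal-like, negative = semiconductor-like)."""
    T = T_AMB
    for _ in range(steps):
        R = max(R0 * (1 + alpha * (T - T_AMB)), 1e-3)  # clamp so R stays > 0
        power_in = V * V / R                # Joule heating
        power_out = K_COOL * (T - T_AMB)    # simple linear cooling
        T += (power_in - power_out) * DT / C_HEAT
        if T > 1000.0:
            return T, True                  # thermal runaway
    return T, False

for alpha in (+0.004, -0.004):  # roughly copper-sized magnitude, per deg C
    T_final, ran_away = simulate(alpha)
    print(f"alpha = {alpha:+.3f}/C:",
          "thermal runaway" if ran_away else f"settles near {T_final:.0f} C")
```

With a positive coefficient the rising resistance throttles the current and the temperature levels off; with a negative coefficient the dissipation outruns the assumed cooling and the feedback loop the asker describes really does occur.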

EDIT: Your initial question referred to a runaway condition in which higher temperatures cause more heat generation. This does not happen in metallic wires under constant voltage. However, it is still possible for the wire to generate enough heat to melt itself if the applied voltage is high enough. For example, if I have a wire with a resistance of 1 ohm but apply only 0.1 volt to it, it will generate only 0.01 watt and will not get very hot. However, if I apply 10 volts to it, it will generate 100 watts and will burn up. (A complete calculation would have to take into account many factors, such as the diameter and length of the wire.)
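
To sanity-check those numbers with P = V^2 / R, using the same 1-ohm wire:

```python
# Power dissipated at constant voltage: P = V^2 / R.
# R = 1 ohm and the two voltages are taken from the example above.
R = 1.0  # ohms
for V in (0.1, 10.0):
    print(f"V = {V:4} V -> P = {V**2 / R:.2f} W")
```

This prints 0.01 W and 100.00 W, matching the figures above.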

In the case of a short circuit, the resistance drops to a low value in a circuit of constant voltage. If I "short out" something, I am connecting a low resistance across the circuit. If the normal circuit "load" is (let's say) 10 ohms across 10 volts, the power is 10 watts. If I "short" that out with a one-ohm wire, the power goes up to 100 watts. The wire will burn out, or (more likely) the source cannot deliver that much power and something else in the circuit burns out.
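
The same formula covers the short-circuit case; here it is with the current made explicit, using the 10-volt, 10-ohm example above:

```python
# Constant 10 V supply: compare the normal 10-ohm load with a 1-ohm short.
V = 10.0  # volts
for label, R in (("normal 10-ohm load", 10.0), ("1-ohm short", 1.0)):
    I = V / R       # Ohm's law
    P = V * I       # same as V^2 / R
    print(f"{label}: I = {I:.0f} A, P = {P:.0f} W")
```

Note the tenfold jump in current; that is usually what the source cannot deliver.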

2007-05-03 21:16:17 · answer #1 · answered by gp4rts 7 · 0 0

It depends on the balance between how fast heat accumulates in the wire and how fast it dissipates to the surroundings.

2007-05-03 21:06:45 · answer #2 · answered by Anonymous · 0 0

TTK is correct - see thermal runaway:
http://en.wikipedia.org/wiki/Thermal_runaway

2007-05-03 21:11:24 · answer #3 · answered by ZeroG2K 2 · 0 0

Yes, if you have access to a thermonuclear device.

2016-04-01 07:55:12 · answer #4 · answered by ? 4 · 0 0
