
Is a 220 V network better than a 110 V one, or is the difference just a matter of safety?

2006-07-09 07:57:53 · 6 answers · asked by hisham_nazmy 1 in Science & Mathematics Engineering

6 answers

Obviously, there is sufficiently little difference in the big picture that both standards have survived in different jurisdictions. (In fact, there are more than two standards: there are places with 110, 120, 130, 220, 230, and 240 V nominal line voltage [generally +/-6%], plus both 50 and 60 Hz frequency standards. There has been a little progress toward increasing standardization, but it has been very slow.) The existence of the various standards has been largely the result of local politics and historical accident.

Roughly speaking, operating a particular appliance requires a particular amount of POWER, which (at least for resistive loads) is current times voltage. If you double the voltage, you draw half the current to achieve the same power. The primary advantage of lower current is that you lose less power in the wires feeding the appliance (or you can use smaller, cheaper wires for the same power-loss rating). On the other hand, the higher voltage is somewhat more dangerous if accidentally touched or if there is an accidental short circuit. Some experienced electricians are relatively casual about touching 110 V circuits, but all respect 230 V. (This is a "don't try this at home" thing, though: it's quite possible to get a fatal shock or start a fire with 110 V!)

Current trends are toward the use of even lower voltages (24 V, 12 V, 5 V, 3.3 V...) for devices that don't draw much total power, to increase safety. Power is rarely distributed at these lower voltages; rather, it is converted from 110 V or 230 V by a transformer at the earliest opportunity. Even in North America, 220-240 V is commonly used for most high-power residential appliances (ovens, furnaces, dryers, large motors, etc.) so that the supply current and supply wire size can be smaller. Higher-power industrial applications often use 480 V or more. And, of course, transmission lines use progressively higher voltages as the distance and total power go up (22,000 V for local distribution to 1,000,000 V for long-distance lines).
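As a rough illustration of the point about wire losses (not from the answer itself): here is a small sketch comparing supply current and resistive feeder loss for the same appliance power at 120 V versus 240 V. The 1500 W load and 0.5-ohm feeder resistance are assumed example values.

```python
def line_loss(power_w, voltage_v, wire_resistance_ohm):
    """Return (supply current in A, I^2*R loss in the feeder wires in W)."""
    current = power_w / voltage_v              # P = V * I for a resistive load
    loss = current ** 2 * wire_resistance_ohm  # power dissipated in the wires
    return current, loss

for v in (120, 240):
    i, loss = line_loss(1500, v, 0.5)
    print(f"{v} V supply: {i:.2f} A drawn, {loss:.1f} W lost in the wires")
```

Doubling the voltage halves the current, and since wire loss scales with the square of the current, the loss drops to a quarter.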

For further reading, one good newsgroup discussion on the issue can be found at sci.engr.lighting:

http://groups.google.com/groups?hl=en&threadm=6bg74c%24r7%40cucumber.demon.co.uk&rnum=12&prev=/groups%3Fq%3Dpros%2Bcons%2B120%2B240%2Bvolts%26hl%3Den%26start%3D10%26sa%3DN

2006-07-09 08:01:06 · answer #1 · answered by Eli 4 · 0 0

dmb06851, that's a good point. But remember I = V/R: the resistance of your body is roughly constant, and the electrical grid can source far more current than a body can carry, so the current you receive in an electrical shock depends on the voltage applied. Also, safety concerns such as sparking across terminals and breakdown of insulation are primarily voltage-dependent.
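The I = V/R point above can be sketched numerically. The ~1 kOhm hand-to-hand body resistance used here is an assumed round number (real values vary widely with skin condition and contact area):

```python
def shock_current_ma(voltage_v, body_resistance_ohm=1000.0):
    """Ohm's law: current in mA through a fixed body resistance."""
    return voltage_v / body_resistance_ohm * 1000.0

for v in (120, 230):
    print(f"{v} V across ~1 kOhm body: {shock_current_ma(v):.0f} mA")
```

With the resistance fixed, shock current scales directly with voltage, which is why the higher-voltage system is the more dangerous one to touch.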

Why 110? Well, why 60 Hz? Why is standard railroad gauge 1435 mm? There's really no single rhyme or reason why 110-120 volts specifically was chosen; it probably came down to a trade-off between safety and the ability to deliver reasonable amounts of current.

It's all a matter of cost. Higher voltage requires better or more insulation, while lower voltage means appliances need very low resistance to draw any useful current. Someone back in the early 1900s probably decided 110 V was best, and it was adopted as a standard because it was the most popular.

2006-07-09 21:01:42 · answer #2 · answered by Anonymous · 0 0

110 V is the nominal voltage; in practice it normally runs between about 105 and 120 VAC. In North America, putting the two hot legs of the split-phase supply together gives you 220 V (really 240 VAC). Most European countries instead supply about 230 VAC on a single phase.

Simple power arithmetic (P = VI) tells you that you use the same amount of power at either voltage: an electric razor running at 120 VAC draws twice the current that the same razor draws at 220 (240 VAC).

The US adopted the lower standard mostly for safety purposes: you are less likely to be killed by a 110 V line than a 220 V one, although for the most part it is current that kills, not voltage. (Having once been hit by the output of a Tesla coil, which is very high voltage at very low current, I can personally attest to that!)

Eli provided an excellent answer - good one!

2006-07-09 08:30:57 · answer #3 · answered by Anonymous · 0 0

Actually, the power delivered to homes in the US is 220 V; it's just center-tapped to give two 110 V legs. Center-tapping adds a degree of safety, not from the reduced voltage as such, but from the fact that the center leg can serve as a neutral at roughly 0 V relative to ground.
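The split-phase arrangement described above can be sketched as follows (assuming nominal 120 V legs at 60 Hz): the two hot legs are 180 degrees out of phase with respect to the center-tap neutral, so the hot-to-hot voltage is double the hot-to-neutral voltage.

```python
import math

def instantaneous(v_rms, t, freq=60.0, phase=0.0):
    """Instantaneous voltage of a sinusoidal supply at time t (seconds)."""
    return v_rms * math.sqrt(2) * math.sin(2 * math.pi * freq * t + phase)

t = 1 / 240  # quarter cycle of a 60 Hz sine, where leg A peaks
leg_a = instantaneous(120, t)                  # hot leg A, measured to neutral
leg_b = instantaneous(120, t, phase=math.pi)   # hot leg B, 180 degrees opposed
print(f"A-N: {leg_a:+.0f} V, B-N: {leg_b:+.0f} V, A-B: {leg_a - leg_b:+.0f} V")
```

At every instant the two legs have equal and opposite voltages to neutral, so the A-to-B difference is a 240 V RMS waveform, while each leg alone supplies only 120 V RMS to neutral.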

2006-07-09 17:32:05 · answer #4 · answered by Anonymous · 0 0

Jeremy M said

"It's to do with safety because it's voltage that kills"

On the contrary, it is current which kills.

There is a very old adage which goes "it's volts that jolts but mils that kills."

In fact, alternating currents of only a few tens of milliamperes through the chest cavity can be sufficient to cause fatal fibrillation.

2006-07-09 14:43:53 · answer #5 · answered by dmb06851 7 · 0 0

It's to do with safety because it's voltage that kills

2006-07-09 08:01:19 · answer #6 · answered by Anonymous · 0 0
