Your question could receive some very complicated answers, but I'll try to keep it as simple as possible.
When you power a device there are two things you need to keep in mind: voltage and current. The two are related but not identical (you can do a Yahoo search for Ohm's Law to see the relationship). Generally, for the same power, the more voltage your device runs on the less current it needs, and vice versa. Wire size is usually determined by the current expected, so if you use 220 you will need less current and therefore smaller wire (you don't need 2 x 110 wires either; it can come right from the box as 220). In effect, by doubling the voltage you are halving the current needed. That makes it more efficient (up to a point).
Add to this the fact that higher voltages travel more efficiently (less loss due to resistance, again because you are using less current) and you have a more efficient system.
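A quick back-of-the-envelope sketch of that "double the voltage, halve the current" point, using Python and a made-up 1100 W appliance (the wattage is just an illustration, not something from the question):

```python
# For one hypothetical 1100 W appliance, current needed at 110 V vs 220 V.
power_watts = 1100.0  # assumed appliance rating, just for illustration

for voltage in (110.0, 220.0):
    current = power_watts / voltage  # from P = V * I, so I = P / V
    print(f"{voltage:.0f} V -> {current:.1f} A")

# 110 V -> 10.0 A
# 220 V -> 5.0 A   (same wattage, half the current)
```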
Some electric motors let you select 220 or 110; in this case your electrician will always choose 220 if possible, for these and other reasons.
Even though devices powered by 220 tend to have thick wires, that doesn't mean 220 needs more copper; it only means that if you powered the same device from 110 it would need even thicker wire than the 220 version.
Remember, 110 or 220 can be very dangerous. Don't ever work with electrical systems unless you understand them completely, and never, ever try to power a 220 device by plugging in 2 x 110 wires.
I hope that helped.
answer #1 · answered by eric b · 2006-12-09 13:34:18
Yikes, so many smart people, yet so many bad answers.
Take a 220/110 volt motor rated at 10 amps on 220 volts and 20 amps on 110 volts. Sounds like at 220 you're saving money, right?
Sadly, though, they all forget you're charged basically on the actual WATTAGE used! Thus 220 volts times 10 amps gives you 2200 watts used... and the same motor at 110 volts and 20 amps gives you, OMG, 2200 WATTS also!
The reason we run at a higher voltage is to save money in the transmission of power (the higher the voltage, the smaller the wires needed). Same in your home: at 110 volts that same motor would require a 30 amp circuit (per code) and a number 10 wire,
while at 220 volts it would need only a 15 amp circuit and a number 14 wire. That's where you're saving the money.
And while, yes, technically a 220 motor is a tad more efficient,
overall it's the wattage rating that saves you money, NOT the voltage. What you hear otherwise is all a big lie.
And oh yes, I almost forgot: if you're running something like an electric hot water heater or baseboard heat, there is NO difference whatsoever in total wattage used! There is no efficiency gain either way like there is in a motor, so when using resistive loads such as these (also incandescent bulbs) there is not even a 1% difference in your cost.
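Roughly what that works out to, as a small Python sketch. The 2200 W motor is the number used above; the one-hour run time is an assumption just for the kWh comparison, and the breaker and wire sizes simply echo the answer rather than being a real code calculation:

```python
# Same 2200 W motor wired for 110 V or 220 V; what the meter and the wiring see.
motor_watts = 2200.0
hours_run = 1.0  # assumed run time, only for the kWh comparison

for voltage, breaker_amps, wire_gauge in ((110.0, 30, "#10"), (220.0, 15, "#14")):
    amps = motor_watts / voltage
    kwh = motor_watts * hours_run / 1000.0  # what the meter actually bills
    print(f"{voltage:.0f} V: {amps:.0f} A draw, {breaker_amps} A breaker, "
          f"{wire_gauge} wire, {kwh:.1f} kWh on the bill")

# Both lines show the same 2.2 kWh; only the amps, breaker, and wire size change.
```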
answer #2 · answered by Anonymous · 2006-12-10 04:34:23
When 220 V enters your house it does so as two 110 V wires. When you draw electricity from both wires at the same time through the same appliance, it adds up to 220 V. At the meter you are charged for the "high leg". If you run all your electricity at 110 through one leg and are using 1 kWh you will be charged for 1 kWh, but if you balance the current between the two legs in the same situation you will be charged for 0.5 kWh, because you are only drawing 0.5 kWh through each leg. Any appliance that runs on 220 V automatically balances its draw across both legs, so you are charged for half as much as if you were drawing the same current on 110 V.
answer #3 · answered by nathanael_beal · 2006-12-09 13:39:19
Without doing a lot of math, I'll try to explain it as best as I can.
The higher the voltage, the more efficient an electric motor is. It has to do with stray magnetic eddy currents generated in the motor.
A 220 motor will pull a little less than half the amperage of the same-horsepower 110 motor. A 480 volt 3-phase motor pulls about 1/5 the amps of a similarly sized 110 volt motor.
Since the amp draw is less as voltage goes up, smaller wires can be used.
Look at the transformer on the pole outside your house: the small wires going in are usually 7500 volts, and the big wires coming out are 220 volts. They are both supplying the same wattage, just at different voltages.
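A rough sketch of that pole-transformer point, assuming a hypothetical 30,000 W load (the load figure is made up for illustration; the 7500 V and 220 V numbers come from the answer):

```python
# A hypothetical 30,000 W load served through the pole transformer.
load_watts = 30_000.0       # assumed figure, not from the answer
primary_volts = 7_500.0     # the small wires going in
secondary_volts = 220.0     # the big wires coming out

primary_amps = load_watts / primary_volts      # about 4 A
secondary_amps = load_watts / secondary_volts  # about 136 A

print(f"Primary:   {primary_amps:.0f} A at {primary_volts:.0f} V")
print(f"Secondary: {secondary_amps:.0f} A at {secondary_volts:.0f} V")
# Same wattage on both sides (ignoring transformer losses); the low-voltage side
# carries far more current, which is why it needs the fat conductors.
```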
Hope we are clear as mud now,
Possum
answer #4 · answered by hillbilly named Possum · 2006-12-09 13:23:04
P = I × E. For the same power, 220 V uses half the amps of 110 V.
answer #5 · answered by jack w · 2006-12-09 12:29:57
You could also ask why we have 60 Hz and Europe has 50 Hz. Sooner or later somebody made the choice to unify the electrical system! Europe actually had a myriad of different voltages before WW2; the reason was that each town and city had its own power plant which supplied the local market. In the city where I grew up there was one part of the city which had 110 V AC, another part ran on 110 V DC, and the third one on 220 V AC!!! After WW2 came to an end a common grid was established and the voltage was set to 220/380 V. You can argue over the pros and cons all day long. The higher the voltage, the smaller the conductors you need: let's say you have a 2200 W heater; if you run it on 220 V you draw 10 A of current, while on 110 V you draw 20 A!!! If you draw more current you need a larger-diameter wire, which translates into higher cost! Although many people argue that higher voltage is more "dangerous", that isn't exactly the truth; it depends more on how much current flows through your body and the path the current takes. In fact, lower voltage is probably more dangerous, because people underestimate the risk and become careless!
answer #6 · answered by ? · 2016-12-13 05:57:46
Power loss is directly related to the resistance of the conductors and the current moving through them. Because it is more difficult to lower the resistance, the voltage is boosted to improve efficiency.
As the first responder mentioned, for the same power, doubling the voltage requires only half the current, since P = Volts × Amps (in Watts).
Power loss is directly related to the current, so having lower current levels means lower losses due to resistance.
This is why high tension power lines are at a high voltage, sometimes as much as 750,000V.
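A minimal sketch of why the loss drops so fast, assuming a line with 1 ohm of resistance carrying a 10 kW load (both figures are made up for illustration, not from the answer):

```python
# I^2 * R line loss for the same delivered power at two different line voltages.
line_resistance_ohms = 1.0  # assumed conductor resistance
delivered_watts = 10_000.0  # assumed load

for line_volts in (220.0, 22_000.0):
    current = delivered_watts / line_volts
    loss_watts = current ** 2 * line_resistance_ohms
    print(f"{line_volts:>8.0f} V: {current:6.2f} A, about {loss_watts:.2f} W lost")

# 220 V:   45.45 A, about 2066 W lost
# 22000 V:  0.45 A, about 0.21 W lost  (100x the voltage, 10,000x less loss)
```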
answer #7 · answered by Austin Semiconductor · 2006-12-09 12:39:12
Generally, higher voltages do not result in less power being used. For example, a 1/2 HP motor will use about the same amount of power regardless of the voltage.
However, the higher voltage means fewer amps, and fewer amps means SMALLER WIRES. Smaller wires are where higher voltages save the most money.
Also, fewer amps means you can connect more devices on a circuit. For example, take fluorescent lighting in a building: at 277 V you can connect more than twice the number of fixtures you could at 120 V. That means smaller wires, fewer circuits, fewer breakers, fewer panels, etc.
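A rough sketch of the fixture math. The 100 W-per-fixture figure, the 20 A circuit, and the "load to 80%" rule of thumb are all assumptions added for illustration; only the 120 V and 277 V figures come from the answer:

```python
# How many identical fixtures fit on one lighting circuit at 120 V vs 277 V.
fixture_watts = 100.0   # assumed fixture load
circuit_amps = 20.0     # assumed breaker size
usable_fraction = 0.8   # load circuits to about 80%, a common rule of thumb

for volts in (120.0, 277.0):
    usable_watts = volts * circuit_amps * usable_fraction
    fixtures = int(usable_watts // fixture_watts)
    print(f"{volts:.0f} V circuit: about {fixtures} fixtures")

# 120 V: about 19 fixtures; 277 V: about 44 fixtures -- more than twice as many.
```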
answer #8 · answered by H_A_V_0_C · 2006-12-09 14:07:05
Because the electric company charges you by the kilowatt-hour, which is directly related to the amount of amps you use at a given voltage in that pay period. They don't charge for voltage. 220 uses half the amps 110 uses for the same wattage.
(I hope I didn't complicate the answer).
answer #9 · answered by dewhatulike · 2006-12-09 15:21:36
It's like trying to drive a large nail with a tiny hammer (110): you'll be beating all day. A large hammer (220) will drive the nail in a few blows; more efficient.
answer #10 · answered by axismiracle · 2006-12-10 02:18:53