To alter the A/C voltage.
You step up the voltage to many thousands of volts, because this makes transmission over long distances much more efficient.
Transformers are then used to step the voltage back down to safer levels for use in homes and offices.
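A quick way to see why the step-up helps: for a fixed amount of power P = V x I, raising the voltage lowers the current, and it is the current that causes losses in the wires. The short Python sketch below only illustrates the idea; the power figure and the three voltages are made-up examples, not real grid values.

```python
# Illustrative only: all numbers below are assumptions, not real grid figures.
power_w = 100_000  # say 100 kW of demand

stages = [
    ("generator output (stepped up soon after)", 11_000),
    ("long-distance transmission line", 275_000),
    ("local supply after step-down", 230),
]

for name, volts in stages:
    amps = power_w / volts  # P = V * I  =>  I = P / V
    print(f"{name:45s} {volts:>8} V  ->  {amps:10.2f} A")
```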
2007-03-29 08:55:46 · answer #1 · answered by Anonymous · 2⤊ 0⤋
The power may leave the plant at 10,000 volts (just a number off the top of my head), and due to line drop the voltage may fall to the point where it can't be transmitted any farther. Step-up transformers then bring the power back up to the higher voltage so it can continue the journey.
At the home level, transformers step the voltage down so that it is compatible with your electric appliances, lights and other doodads.
Transformers can step voltage up or down. (Converting A.C. to D.C. and back, though, is the job of rectifiers and inverters rather than transformers.)
Without transformers, power would have to be transmitted at household voltage, and there would have to be a power plant about every ten miles or so to keep the voltage consistent.
2007-03-29 11:15:28 · answer #2 · answered by rann_georgia 7 · 0⤊ 0⤋
Electricity loses voltage, and therefore power, as it is sent over any distance. To overcome this, power stations feed the grid at very high voltages, such as 250,000 volts. Transformers, which work by a primary coil wound around a secondary coil, then step this down in stages, lowering the voltage and increasing the available current. This way the national grid can send power the length of the country. Transformers can also step voltage up: on the boilers I service, transformers convert 240 volts to 12,000 volts at low current to ignite the fuel. I hope this answer helps, but I am no expert!
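For what it's worth, the 240 V to 12,000 V ignition transformer mentioned above implies a turns ratio of about 1:50, from the ideal-transformer relation V_secondary / V_primary = N_secondary / N_primary. The sketch below just works that out; the winding counts are invented purely for illustration.

```python
# Ideal-transformer relation: Vs / Vp = Ns / Np (losses ignored).
v_primary = 240.0       # supply voltage quoted in the answer above
v_secondary = 12_000.0  # ignition voltage quoted in the answer above

ratio = v_secondary / v_primary        # = Ns / Np
n_primary = 200                        # assumed number of primary turns
n_secondary = round(n_primary * ratio)

print(f"turns ratio ~ 1:{ratio:.0f}")
print(f"{n_primary} primary turns -> about {n_secondary} secondary turns")
# Power in = power out (ideally), so the secondary current is 1/ratio of the primary current.
```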
2007-03-29 09:00:41 · answer #3 · answered by kjk26 1 · 0⤊ 0⤋
It turns out that it is much more efficient to transport electricity over long distances at higher voltages. The common power line outside your house may be at an electric potential of a thousand volts or more. These high voltages are not directly usable by everyday appliances, which need to draw their current at a much lower voltage. Since power is P = V I, delivering the same power at a higher voltage means a lower current, and a lower current means less voltage drop along the line (V = I R) and less heating loss. A transformer is the device used to change the voltage: up for long-distance transport, down for use in our homes.
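As a rough illustration of the P = V I point, here is what the same load would draw at different supply voltages; the 2 kW figure is an arbitrary example (roughly a kettle), not a claim about any particular device.

```python
# Assumed example load of 2 kW; the voltages are illustrative.
appliance_power_w = 2_000.0

for volts in (230, 1_000, 11_000):
    amps = appliance_power_w / volts  # I = P / V
    print(f"at {volts:>6} V the same load draws {amps:6.2f} A")
```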
2007-03-29 08:59:57 · answer #4 · answered by msi_cord 7 · 0⤊ 0⤋
Power loss in the transmission cables is proportional to the square of the current (I^2 * R). If you use a transformer to step the power-station voltage up to, say, 425 kV on the power lines, you reduce the current in the same ratio (roughly 1/2000 compared with sending it at mains voltage). That reduces the power loss by a factor of about 4x10^6. The premium to be paid at each end for the step-up/step-down transformers is a loss of a few per cent of efficiency each.
The alternative is super-cooled cables, where the resistance effectively vanishes, or 'room temperature' superconductor materials, for which the fabrication technology doesn't exist yet.
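To sanity-check the figures in this answer: if the voltage is raised by a factor k for the same power, the current falls by 1/k and the I^2 R loss falls by k^2. A minimal sketch, assuming the comparison is between roughly mains voltage (240 V) and 425 kV, with an arbitrary line resistance and load:

```python
# All numbers are assumptions used only to check the ratio; the line
# resistance and the power carried cancel out of the comparison.
line_resistance_ohm = 1.0
power_w = 1_000_000.0   # 1 MW carried

def line_loss_w(volts):
    current = power_w / volts
    return current ** 2 * line_resistance_ohm   # I^2 * R

low_v, high_v = 240.0, 425_000.0
k = high_v / low_v   # ~1800, i.e. the "roughly 1/2000" current ratio

print(f"step-up factor k:         {k:.0f}")
print(f"loss reduced by k**2:     {k ** 2:.1e}")   # a few million, of the order of the 4x10^6 quoted
print(f"direct ratio of losses:   {line_loss_w(low_v) / line_loss_w(high_v):.1e}")
```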
2007-03-29 09:07:33 · answer #5 · answered by troothskr 4 · 0⤊ 0⤋
Because of several things:
1: Volt drop - if it were distributed at 240 V, the volt drop over a distance of just 1 km would be massive.
2: If you use a high voltage, the current drops, and therefore the cable size needed is much smaller. If distributed at 240 V, the cables would have to be HUGE because the current drawn would be vast.
Transformers step the voltage up and down across the national grid where needed (a rough worked sketch follows).
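A minimal sketch of both points, assuming a 100 kW load fed through 1 km of cable with 0.5 ohm total loop resistance (both figures invented for illustration):

```python
load_power_w = 100_000.0      # assumed local demand
loop_resistance_ohm = 0.5     # assumed resistance of the 1 km run, out and back

for supply_v in (240.0, 11_000.0):
    current_a = load_power_w / supply_v          # I = P / V
    volt_drop = current_a * loop_resistance_ohm  # V = I * R
    print(f"{supply_v:>8.0f} V supply: {current_a:8.1f} A, "
          f"drop over the 1 km run ~ {volt_drop:7.1f} V")
# At 240 V the current (and so the required cable size and the volt drop)
# is enormous; at 11 kV both are modest.
```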
2007-03-30 00:05:29 · answer #6 · answered by Anonymous · 0⤊ 0⤋
Because Robots in Disguise are more effective at managing electricity than humans.
Seriously - the electricity travels long distances at a much higher voltage (and correspondingly lower current). The transformer steps it down to a voltage suitable for domestic use.
2007-03-29 09:08:04 · answer #7 · answered by Anonymous · 0⤊ 0⤋
Power is volts multiplied by amperes (current).
So if you step up the voltage, you can reduce the current flowing along the distribution wires.
You can then send power down a cable with minimal loss from wire resistance. A simple example:
The voltage lost along a wire is the current multiplied by the resistance, so if you fed a 100 amp current down a one-ohm wire you would lose 100 volts along the wire. Not very clever if your source is, say, 220 volts. That loss heats up the wire and amounts to 10 kilowatts wasted (100 volts x 100 amperes).
Transform the source with a step-up transformer to 2,200 volts
and the current is now only 10 amps, so the loss is only 10 volts. The heat lost in the wire is 10 x 10 = 100 watts.
By calculation, the best compromise can be struck between the cost of transformers and cable sizes to minimise distribution costs.
Thomas Edison could not transform his direct-current
system other than by using very expensive rotary motor-generator sets. Consequently alternating current, with its relatively simple transformers, was finally universally adopted.
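The arithmetic above can be reproduced in a few lines; a sketch using exactly the answer's figures (a 1 ohm wire, 100 A direct versus 10 A after a 10x step-up):

```python
wire_resistance_ohm = 1.0

for label, current_a in (("fed directly at 220 V", 100.0),
                         ("stepped up to 2,200 V", 10.0)):
    volt_drop = current_a * wire_resistance_ohm    # V = I * R
    heat_w = current_a ** 2 * wire_resistance_ohm  # P = I^2 * R
    print(f"{label:22s}: drop = {volt_drop:5.0f} V, heat wasted = {heat_w:7.0f} W")
# Prints 100 V / 10,000 W for the direct case and 10 V / 100 W after the
# step-up, matching the figures in the answer.
```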
2007-03-29 09:23:20 · answer #8 · answered by anthony e 2 · 0⤊ 0⤋
Electricity is distributed around the country in large cables carrying large amounts of power - enough for everyone. A transformer is a bit like a tap that only lets a small, safe amount of it through to the smaller cables feeding your home. Without these, your TV would get such a large dose of electricity it would scream like a girl and its hair would curl.
2007-03-29 09:03:43 · answer #9 · answered by joesmokette 1 · 0⤊ 0⤋
There are two reasons:
For any transmission line with cross-sectional area A, length L, and resistivity ρ,
V2 = V1 - 2IρL/A
P(lost) = 2I^2ρL/A,
so that supply voltage shows a linear decrease with line length, the slope of the decrease being proportional to the current, and the heating loss in the transmission line is proportional to the square of the current.
Disregarding power factor,
P = VI
I = P/V
If we only double the voltage the current needed drops to 1/2, decreasing voltage drop by 50% and power lost by 75% (1 - (1/2)^2).
Increasing voltage by a factor of 1,000 results in a 99.9% decrease in voltage drop and a 99.9999% decrease in power lost. (Practically, each transformer introduced carries a 3% loss of power, so the picture isn't quite that good.)
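A small sketch of the two formulas above, with assumed values (copper resistivity, a 100 mm^2 conductor, a 10 km run, 100 kW delivered) just to show the trend; none of these figures come from the answer itself.

```python
rho = 1.68e-8        # resistivity of copper, ohm*metre (approximate)
area = 100e-6        # conductor cross-section, m^2 (assumed 100 mm^2)
length = 10_000.0    # one-way line length, m (assumed 10 km)
power_w = 100_000.0  # power being delivered, W (assumed 100 kW)

for v1 in (1_000.0, 10_000.0, 100_000.0):
    current = power_w / v1
    v2 = v1 - 2 * current * rho * length / area       # V2 = V1 - 2*I*rho*L/A
    p_lost = 2 * current ** 2 * rho * length / area   # P(lost) = 2*I^2*rho*L/A
    print(f"V1 = {v1:>9.0f} V:  V2 = {v2:>10.1f} V,  P lost = {p_lost:>10.1f} W")
```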
2007-03-29 09:55:40 · answer #10 · answered by Helmut 7 · 0⤊ 0⤋