
6 answers

Are you referring to power/utility transmission?

The reason voltage is stepped up to a high value before being stepped back down to 120 Vrms (in America, at least) when it reaches your house is to reduce line losses.

Say you have a microwave drawing 1200 W. At 120 Vrms, it draws 10 Amps. To make the math easy, if the voltage were stepped up to 1200 Vrms, there would only be 1 Amp flowing through the power lines until it reached your house (power is conserved: 1200 Vrms * 1 Amp = 120 Vrms * 10 Amps).

But power lines do have resistance. The power lost in a line is given by I^2*R. So to deliver 1200 W to your microwave, the power company has to generate more power than you are drawing, to make up for what is lost in the lines. As they increase the voltage, the transmitted current goes down, and the I^2*R losses go down with it.

If the resistance of the power line were 25 ohms (to pick an arbitrary number), then delivering 1200 W at 120 V/10 A would mean a line loss of (10 A)^2 * 25 ohms = 2500 W. The power would not even reach your microwave. If the power company instead supplied 120 kV/10 mA, the line loss would be (0.01 A)^2 * 25 ohms = 2.5 mW.
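The numbers above can be checked with a short script. This is just an illustrative sketch using the answer's arbitrary 25-ohm line resistance and 1200 W microwave load, not real grid figures:

```python
# Sketch: compare I^2*R line losses for the same 1200 W load delivered
# at different line voltages. The 25-ohm resistance and 1200 W load are
# the arbitrary example numbers from the answer above.

LINE_RESISTANCE_OHMS = 25.0
LOAD_POWER_W = 1200.0

def line_loss(line_voltage_v: float) -> float:
    """Power dissipated in the line (I^2 * R) at a given line voltage."""
    current_a = LOAD_POWER_W / line_voltage_v
    return current_a ** 2 * LINE_RESISTANCE_OHMS

for volts in (120.0, 1200.0, 120_000.0):
    print(f"{volts:>9.0f} V line -> {line_loss(volts):.6f} W lost")
```

Running it reproduces the figures in the answer: 2500 W lost at 120 V, 25 W at 1200 V, and 2.5 mW at 120 kV.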

2006-10-28 04:44:40 · answer #1 · answered by Anonymous · 0 0

dmb06851 makes a good point, but keep in mind I = V/R: since the resistance of your body is roughly constant, and since we can assume the electric grid can supply a realistic amount of current in the first place, the amount of current you receive in an electric shock depends on the voltage applied. Also, when you talk about safety in terms of sparking between terminals and arcing across insulation, those are mostly voltage-dependent. Why 110? Well, why 60 Hz? Why is standard railroad gauge 1435 mm? There is really no deep rhyme or reason. 110-120 volts was probably chosen as a trade-off between safety and the ability to supply realistic amounts of current. It is all a matter of costs: the higher the voltage, the better the insulation you need; the lower the voltage, the lower the resistance your household appliances must have if you want any current at all. Someone back in the early 1900s probably decided 110 V was suitable, and it was chosen as the standard because it was the most common.
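The I = V/R point above can be made concrete. This is a hypothetical illustration only: the 1000-ohm body resistance is an assumed round number (real values vary widely with skin condition and contact area), not a safety figure:

```python
# Hypothetical illustration of I = V / R: if body resistance is roughly
# constant, shock current scales directly with contact voltage.
# The 1000-ohm body resistance is an assumed round number.

BODY_RESISTANCE_OHMS = 1000.0

def shock_current_ma(contact_voltage_v: float) -> float:
    """Shock current in milliamps through an assumed fixed resistance."""
    return contact_voltage_v / BODY_RESISTANCE_OHMS * 1000.0

for v in (110.0, 230.0):
    print(f"{v:.0f} V contact -> {shock_current_ma(v):.0f} mA")
```

Under these assumptions, a 230 V shock pushes roughly twice the current of a 110 V one, which is the sense in which lower voltage is "safer".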

2016-12-05 07:50:39 · answer #2 · answered by Anonymous · 0 0

I'm not an electrical engineer, but the reason I was forced to memorize at some point in my education is that there is less energy loss with high voltage transmission. That's why the power lines transmit at higher voltage, requiring step-down transformers to get the 120 volts that is commonly used.

2006-10-28 04:32:41 · answer #3 · answered by WildOtter 5 · 0 0

WildOtter is correct. As a transformer steps up voltage, it steps down current, and the dissipated power follows P = I^2*R, or equivalently P = V^2/R (where V is the voltage dropped across the line itself). That is, power equals current squared times resistance, or voltage squared divided by resistance. So the power wasted by having to push electricity through the transmission lines is reduced if the voltage is high and the current is low.
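The step-up argument can be sketched in a few lines. The numbers here are illustrative assumptions, not data from the answer: for a fixed delivered power, an ideal transformer trades voltage for current (V1*I1 = V2*I2), so the line's I^2*R loss falls with the square of the step-up ratio:

```python
# Sketch: for a fixed delivered power, stepping voltage up by a factor k
# cuts line current by k, so I^2*R line loss shrinks by k^2.

def loss_ratio(step_up: float) -> float:
    """Factor by which I^2*R line loss shrinks for a given step-up ratio."""
    return 1.0 / step_up ** 2

for k in (2, 10, 100):
    print(f"step voltage up {k:>3}x -> line loss multiplied by {loss_ratio(k):g}")
```

Doubling the voltage quarters the loss; a 100x step-up cuts it by a factor of 10,000.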

Even so, it is estimated that twelve percent of all the power generated in the US is dissipated as heat in the transmission lines before it can get to the consumers.

If we were able to have superconducting transmission lines, the power savings would be equivalent to building dozens of new power generating plants across the country, without having to actually build them. This in turn would save enormous amounts of fossil fuels (coal, oil, gas).

We are not close to having room-temperature superconductors, but we do have high-efficiency fluorescent lamps and light-emitting-diode lamps. About 90 percent of the power going into an incandescent bulb is wasted as heat. If we replaced incandescent lamps with power-saving fluorescent or LED lamps across the country, we would also free up surplus electrical generating capacity and conserve coal, oil and gas.

28 OCT 06, 1149 hrs.

2006-10-28 04:45:24 · answer #4 · answered by cdf-rom 7 · 0 0

I am not an electrical engineer, but I believe the main advantage of an HV network is:

Higher power can be transmitted from one point to another, which in turn requires a high-power source to drive it.

And the disadvantages:
Because of the above, there are correspondingly higher losses across HV lines (especially during wet seasons), which could result in burning poles if the lines are not properly insulated from their supports or are overloaded.
High voltage also generates noise harmonics; that is the interference you hear on your car radio when passing under a high-voltage line.

HV lines also create radiation (electromagnetic fields) that is claimed to be harmful to humans, which is why electricity boards set minimum distances between HV lines and residences. Longer exposure is also said to increase cancer risk.

Because of all the above, HV lines require larger conductors and heavier insulation in order to minimise power loss to the atmosphere.

2006-10-28 05:45:19 · answer #5 · answered by Mark 1 · 0 0

At higher voltage, the required ampacity of cables and bus bars for distribution is reduced, especially over long distances.
Excessive voltage drop at the equipment location is also less likely.
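The voltage-drop point can be sketched numerically. All values here are assumed for illustration: for the same delivered power, the series drop I*R along a feeder is a smaller fraction of the supply voltage when the distribution voltage is higher:

```python
# Assumed-numbers sketch of feeder voltage drop: the same 10 kW load
# fed through the same 0.5-ohm feeder at two distribution voltages.

FEEDER_RESISTANCE_OHMS = 0.5
LOAD_POWER_W = 10_000.0

def percent_drop(supply_voltage_v: float) -> float:
    """Feeder I*R drop as a percentage of the supply voltage."""
    current_a = LOAD_POWER_W / supply_voltage_v
    drop_v = current_a * FEEDER_RESISTANCE_OHMS
    return 100.0 * drop_v / supply_voltage_v

for volts in (240.0, 2400.0):
    print(f"{volts:>5.0f} V supply -> {percent_drop(volts):.3f}% drop")
```

Under these assumptions the drop is about 8.7% at 240 V but under 0.1% at 2400 V, since the percentage drop scales as 1/V^2 for a fixed load power.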

2006-10-28 04:42:36 · answer #6 · answered by java 4 · 0 0
