
When electric power is distributed through the national grid, the nominal voltages are typically 345, 138, 34.5, and 12.47 kV. When voltages are higher than their typical nominal level, power can be saved through a reduction in line loss. Power saved = money saved. My question: if the voltage or conductor diameter were increased beyond today's standards, what is the maximum amount of money that could be saved by minimizing line loss? Is it enough to be worth the investment in replacing equipment with higher-rated equipment?

2007-03-26 09:39:45 · 3 answers · asked by Anonymous in Science & Mathematics Engineering

3 answers

The amount of power lost is I²R; this electrical energy is wasted as heat.
To decrease I, use transformers to provide a higher-voltage distribution system. The cost involved is adding substations.
To decrease R, use a larger conductor size. To prevent sagging, the poles then have to be spaced closer together.
The utility company has already calculated the effective cost.
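
As a rough illustration of the I²R effect, here is a minimal sketch; the load, feeder length, and per-km resistance are assumed values, not utility data:

```python
# Sketch of the I^2*R relation: for the same delivered power, raising the
# voltage cuts the current and the loss falls with 1/V^2. Single-phase
# simplification ignoring power factor and reactance; all inputs assumed.

def line_loss_watts(load_w, voltage_v, resistance_ohm):
    current_a = load_w / voltage_v      # I = P / V
    return current_a ** 2 * resistance_ohm

load_w = 10e6                # 10 MW delivered (assumed)
resistance_ohm = 0.2 * 5     # 0.2 ohm/km over a 5 km feeder (assumed)

for kv in (12.47, 34.5):
    loss = line_loss_watts(load_w, kv * 1e3, resistance_ohm)
    print(f"{kv} kV: ~{loss / 1e3:.0f} kW lost ({100 * loss / load_w:.1f}% of load)")
```

With these assumed numbers, moving from 12.47 kV to 34.5 kV cuts the loss by roughly (34.5/12.47)² ≈ 7.7 times for the same load and conductor.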

2007-03-26 10:41:04 · answer #1 · answered by Anonymous · 0 0

Determining the cost savings due to reduced line losses at a higher voltage would need to be calculated on a case-by-case basis.

Line loss is a function of the transmission line design (conductor size, spacing, and materials) as well as the size of the load, distance power is being transmitted, and transmission voltage. Transmission systems are designed to optimize cost vs. efficiency based upon these (and other) variables.
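
To get a feel for the numbers, here is a back-of-the-envelope sketch; the conductor size, length, load, running hours, and energy price are all assumed placeholders:

```python
# Rough annual cost of line loss for one assumed feeder, compared at two
# voltages. R is taken as rho * length / area; constant full load and a
# flat energy price are simplifying assumptions.

RHO_ALUMINUM = 2.82e-8                   # ohm*m, approximate resistivity

def annual_loss_cost(load_w, voltage_v, length_m, area_m2,
                     hours=8760, price_per_kwh=0.05):
    r = RHO_ALUMINUM * length_m / area_m2
    current_a = load_w / voltage_v       # ignores power factor
    loss_w = current_a ** 2 * r
    dollars = loss_w / 1000 * hours * price_per_kwh
    return loss_w, dollars

# 5 MW load, 10 km feeder, 300 mm^2 aluminum conductor (all assumed):
for kv in (12.47, 34.5):
    loss_w, cost = annual_loss_cost(5e6, kv * 1e3, 10_000, 300e-6)
    print(f"{kv} kV: ~{loss_w / 1e3:.0f} kW lost, ~${cost:,.0f}/yr")
```

With these made-up inputs the higher voltage saves on the order of tens of thousands of dollars per year on one feeder; whether that beats the cost of higher-rated transformers, insulation, and substation equipment is exactly the case-by-case comparison described above.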

2007-03-26 19:13:26 · answer #2 · answered by snowcat16 2 · 0 0

Most companies that deal with these problems use computer programs to optimize their designs to satisfy both the engineering and economic requirements.
Electrical, gas, and liquid transmission systems all present the same kind of problem and depend heavily on computers for their design.
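
A toy version of that kind of optimization, just sweeping conductor sizes to balance conductor cost against the cost of lost energy; all prices and the 30-year horizon are invented for illustration:

```python
# Pick the conductor size that minimizes installed conductor cost plus
# the (undiscounted) 30-year cost of I^2*R losses. Every constant here
# is an assumed placeholder, not real utility data.

RHO = 2.82e-8                 # ohm*m, aluminum (approx.)
LENGTH_M = 10_000             # 10 km feeder
LOAD_W, VOLTAGE_V = 5e6, 12_470
PRICE_PER_KWH, YEARS = 0.05, 30
COST_PER_MM2_PER_KM = 100.0   # assumed installed conductor cost

def total_cost(area_mm2):
    r = RHO * LENGTH_M / (area_mm2 * 1e-6)
    loss_w = (LOAD_W / VOLTAGE_V) ** 2 * r
    loss_cost = loss_w / 1000 * 8760 * YEARS * PRICE_PER_KWH
    conductor_cost = area_mm2 * (LENGTH_M / 1000) * COST_PER_MM2_PER_KM
    return conductor_cost + loss_cost

best = min(range(100, 1001, 50), key=total_cost)
print(f"Cheapest size in this sweep: {best} mm^2, ~${total_cost(best):,.0f} total")
```

Real design tools also model load growth, thermal limits, reliability, and discounting, but the trade-off has this basic shape.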

2007-03-27 01:35:15 · answer #3 · answered by gatorbait 7 · 0 0
