
I'm an EE major, and something has always bugged me in the (self-chosen) as-little-as-possible interaction I've had with circuits:

How can a voltage source be high voltage, low current? More specifically, how can the output current be variable given a specific voltage? Is it not as simple as applying V = IR? Given the R of the load you can find I? If, for example, a power supply limits the current out, how does this not directly limit the voltage out, assuming the same load?

Supposedly power lines use high voltage, but low current to transfer power, and they say touching high voltage lines isn't dangerous, as long as the current is low...how is the current not determined by the resistance of your body??

Please help restore the foundations of my supposedly good education.

2007-12-07 13:31:04 · 10 answers · asked by R D 2 in Science & Mathematics Engineering

10 answers

Imagine a water pump. It has an input and an output. Inside it has small tubes and an impeller which moves water from the input side to the output side.

You connect a 2 cm diameter clear pipe full of water between the input and output (making a closed loop). You turn the pump on and notice that it moves 5 liters of water per second through the pipe. Great. Next, you try a 4 cm diameter pipe, and the system now moves 20 l/s of water. Finally, you try an 8 cm diameter pipe and expect to see, say, 80 l/s of water? Only you still see just 20 l/s.

The reason is fairly simple: inside, the pump itself uses a 4 cm pipe, so it can only move a constant 20 l/s of water through the impeller.

Same with a voltage source. Take a AA battery rated at, say, 1.2 V and 2.5 amps. Similar to the pump, it moves things (in this case electrons) through itself and outputs them when you form a circuit. But also like the pump, there is a limit to the rate at which it can move water (er... electrons) from one end of itself to the other.

Similar to the pump, you could put a big resistor across the battery. It will put out 1.2 V, and the current through the resistor will be I = 1.2 V / R. Use a 100 ohm resistor and you get 1.2/100 = 0.012 amps. Use a 10 ohm resistor and you get 1.2/10 = 0.12 amps. OK. Use a 0.1 ohm resistor and you expect to get 1.2/0.1 = 12 amps, correct? Nope... you get about 2.5 amps. The reason is simple: like the pump, it can't move that many electrons from the anode to the cathode. All power supplies have some internal resistance. If you put a wire directly across a battery, the battery just heats up, because the battery itself is acting as the resistor. The heat typically damages the battery (and most power supplies).

In your EE courses they should discuss this internal resistance that power sources have. It effectively limits their output current. When designing electronics, you often need to ensure that the circuit has some minimum resistance, usually several times the internal resistance of the power supply.
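A quick numeric sketch of that idea in Python (the 0.5 ohm internal resistance is just an assumed round number for illustration, not a real AA spec):

# Rough sketch of a non-ideal battery: an ideal 1.2 V source in series
# with an assumed internal resistance (0.5 ohm is illustrative only).
V_EMF = 1.2      # open-circuit voltage, volts
R_INT = 0.5      # assumed internal resistance, ohms

def load_current(r_load):
    """Current through an external resistor connected across the battery."""
    return V_EMF / (R_INT + r_load)

for r_load in (100, 10, 1, 0.1, 0.01):
    i = load_current(r_load)
    v_terminal = i * r_load          # voltage actually seen at the terminals
    print(f"R_load = {r_load:>6} ohm -> I = {i:5.3f} A, V_terminal = {v_terminal:4.2f} V")

# As R_load shrinks, I no longer follows 1.2/R_load; it levels off
# near V_EMF / R_INT, which is why a near-short does not give 12 A.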

And if you are an EE, it is not possible to avoid circuits. More than half your courses and most of your labs will involve them. Other than some math courses, a bit of physics, wave theory, electrical properties of matter, and possibly a CS or ME course, the vast majority of your courses (and nearly all the labs) will be about circuits.

2007-12-07 14:25:32 · answer #1 · answered by bw022 7 · 1 0

You miss the point.

High voltage is used, in many instances, to reduce the current required for power transmission. You need a certain amount of POWER. That same power can be delivered using low voltage and high current, or high voltage and low current. You want the low-current option whenever possible, because current produces I^2 R losses in the line. Do the math: with the same amount of resistance (R), do you lose more power with a higher or lower current?

i.e., for a 1000W requirement (or load), you can have:

100v x 10A = 1000W or
1000V x 1A = 1000W.

But what have you gained? Assume the transmission line has a resistance of 0.1 ohm.

10 A x 10 A x 0.1 ohm = 10 W loss (I^2 x R = W)
or
1 A x 1 A x 0.1 ohm = 0.1 W loss.

The best bet is the higher voltage for a more efficient transfer of power to the load.
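If you want to see it as code, here is the same arithmetic in a short Python sketch (it simply re-runs the numbers above):

# Quick check of the numbers: the same 1000 W load delivered at two
# voltages over a line with 0.1 ohm of resistance (the answer's assumed value).
P_LOAD = 1000.0     # watts required at the far end
R_LINE = 0.1        # ohms of transmission-line resistance

for v in (100.0, 1000.0):
    i = P_LOAD / v              # current needed to deliver the power
    loss = i ** 2 * R_LINE      # power burned in the line itself (I^2 * R)
    print(f"{v:6.0f} V -> {i:5.1f} A, line loss = {loss:6.2f} W "
          f"({100 * loss / P_LOAD:.2f}% of the delivered power)")

# Raising the voltage tenfold cuts the line loss a hundredfold,
# which is the whole point of high-voltage transmission.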

The demand from the power supply is determined by the load requirements. You cannot plug a 110 V device into a 1000 V outlet, and I am not saying you can; we are talking about transferring power from the power station to the end user. There are other considerations, such as safety, but that is the price you pay to transfer the power efficiently, and you adjust for it.

I hope this helps.

2007-12-07 17:16:58 · answer #2 · answered by Warren W- a Mormon engineer 6 · 0 0

A voltage source only attempts to maintain a constant voltage. The current out of a power supply is limited because the power supply is only capable of supplying a pre-specified amount of power, as determined by its components. In other words, THERE IS NO SUCH THING AS AN IDEAL VOLTAGE SOURCE. If the power rating of any voltage source is exceeded, it will no longer act as a voltage source, i.e. the voltage will drop (linearly or not, but probably not) with an increasing load.
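To make that concrete, here is a tiny Python model of a current-limited supply. It is a simplified sketch assuming a hard 1 A limit, not a description of any particular unit:

# Simplified model of a current-limited supply: it holds V_SET until the
# load would draw more than I_LIMIT, then it lets the voltage sag so the
# current never exceeds the limit.
V_SET = 12.0     # volts the supply tries to hold
I_LIMIT = 1.0    # amps it is willing to deliver

def output(r_load):
    i_wanted = V_SET / r_load            # what Ohm's law asks for
    if i_wanted <= I_LIMIT:
        return V_SET, i_wanted           # constant-voltage mode
    return I_LIMIT * r_load, I_LIMIT     # constant-current mode: voltage drops

for r in (100, 20, 12, 6, 1):
    v, i = output(r)
    print(f"R = {r:>3} ohm -> V = {v:5.2f} V, I = {i:4.2f} A")

# Ohm's law still holds at every point; the supply simply stops being a
# 12 V source once the load asks for more current than it can give.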

As far as the transmission lines go, you can hang from a transmission line all day long, but don't provide a path to ground or to another phase, or the current definitely will not remain low :)

2007-12-07 14:02:18 · answer #3 · answered by SimplyMe 4 · 0 0

Most "high voltage, low current" supplies are designed around a load that draws almost nothing, such as a bias supply or a static source.

You are thinking that most loads would be rotated between different voltage sources, and therefore the current would just follow the standard I = E/R formula.

That's not the case here; think instead of the special bias needed for a cathode ray tube: about 5 kV, but current in the microamps.

As far as dangerous current through the body goes, the most dangerous range is around what the heart itself works with, about 5 to 15 milliamps, which confuses the heart muscle and can cause cardiac arrest.

Significantly less current causes twitching; significantly more causes burns, but not the deadly disruption of the heart's rhythm.
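To tie this back to the original question about body resistance, here is a rough Python sketch; the skin-resistance figures are assumed round numbers in the commonly quoted range, not measurements:

# Rough illustration of why the current through you depends on both the
# voltage across you and your body's resistance. The resistance values are
# assumed ballpark figures (dry skin vs. wet skin), not measurements.
for label, r_body in (("dry skin, ~100 kohm", 100_000),
                      ("sweaty skin, ~1 kohm", 1_000)):
    for v in (120, 10_000):
        i_ma = v / r_body * 1000
        print(f"{v:>6} V across {label}: ~{i_ma:8.1f} mA")

# A bird on a single high-voltage line sees almost no voltage across its body
# (no path to ground), so almost no current flows, whatever its resistance is.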

2007-12-07 13:52:19 · answer #4 · answered by duh 4 · 0 0

The amount of current that can flow through a cable is usually limited by the size of the overcurrent device protecting it. The current is how much charge is flowing through the wire; the voltage on an AC line varies with time as a sine wave. Power lines use higher voltages because long runs of wire add a lot of resistance to the circuit. The more resistance in the circuit, the more the voltage drops along the line, while the current through the line and the load stays the same.

2007-12-07 13:46:25 · answer #5 · answered by Noel 1 · 0 0

You are correct in your understanding of Ohm's law. If you touched a hot line on a power pole you could be fried.

When people choose to design for high voltage and low current, they must design the load to be high impedance.

When a power supply limits output current, the voltage does drop.

2007-12-07 15:18:45 · answer #6 · answered by Tim C 7 · 0 0

R has nothing to do with the voltage and current of a source. V = IR is just a formula that applies to metallic conductors, and even then not perfectly, so don't make Ohm's law your entire foundation in electrical science.

Current is the amount of charge flowing from one point to another each second. Charges flow from a higher potential to a lower potential, just like water flows from a higher point to a lower point.

Voltage is the potential difference between these two points. Suppose the potential at point A is 6 and at point B is 4; then current will flow from A to B to neutralize the difference, which is 6 - 4 = 2, or 2 volts.

Always compare a wire to a water pipe: the thickness of the pipe to the current capacity of the wire, current to the volume of water flowing, voltage to the speed of the water, and wattage to the total volume of water delivered.

Resistance can be compared to friction, which can affect the speed of the water's flow, but friction or resistance does not dictate what V and I (or the water's speed and quantity) are at the source, right? All R does is affect the process; it does not change the source.

So whether you have 6 volts at 1 ampere or 3 volts at 2 amperes, you still have only 6 watts of power, which you could manipulate into even 6000 volts at one milliampere and again have just 6 watts.

So you can have a really thick pipe with slow-moving water and a thin pipe with jet-speed water, and still have the same volume of water flowing per second.

So, depending on your application, you design the voltage and current characteristics for the wattage you are going to spend/transfer.
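A quick numeric check of that point, using the voltage/current pairs from this answer:

# Same power, different voltage/current splits, using the figures above.
pairs = [(6, 1), (3, 2), (6000, 0.001)]   # (volts, amps)
for v, i in pairs:
    print(f"{v:>6} V x {i:>6} A = {v * i:.0f} W")
# All three combinations deliver the same 6 W; only the "shape" of the delivery changes.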

2007-12-07 14:59:30 · answer #7 · answered by PT 2 · 0 1

High voltage, low current doesn't apply to power transmission. Power transmission is usually at hundreds of kilovolts and as much as tens of kiloamps.

High voltage at low current is usually used to operate CRTs, TV sets, the power supplies of transmitters, etc. Another common example is the voltage needed to jump the gap of a spark plug and ignite the mixture in the combustion chamber; the value is in the tens of kilovolts.

For high voltage at low current, look no further than static charge. Some people can even create a spark at the heel when walking across a carpet in the winter. That spark is at a high enough voltage to ignite gasoline vapor; it might shock people, but it doesn't carry enough energy to burn anything.
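To see why such a spark is harmless despite the huge voltage, here is a back-of-the-envelope Python sketch; the capacitance is the commonly used human-body-model value and the voltage is an assumed round number:

# Back-of-the-envelope energy in a static-electricity spark.
# 100 pF is the usual human-body-model capacitance; 15 kV is an assumed
# round figure for a good carpet shock.
C_BODY = 100e-12    # farads
V_STATIC = 15_000   # volts

energy_j = 0.5 * C_BODY * V_STATIC ** 2       # E = (1/2) C V^2
print(f"Stored energy: {energy_j * 1000:.1f} mJ")   # roughly 11 mJ

# Kilovolts of potential, but only millijoules of energy: plenty to make a
# visible spark (or ignite fuel vapor), far too little to heat up a person.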

2007-12-08 08:57:39 · answer #8 · answered by Eddie W 7 · 0 0

Are you sure about the current rating of the existing output? A rating of 0.1 mA is not a realistic figure; probably it is a typing error and it should be 0.1 A. My answer below assumes the output is 30 V - 0.1 A. If the final output has to be AC, then you have to add an inverter (a transformer is ruled out in the question). If the final output has to be DC, then a chopper with an inductor will provide an output of 10 volts, rated for approximately 0.3 amps. With an inverter, the losses will be much higher and therefore the output current will be much less than 0.3 amps.

2016-04-08 00:47:45 · answer #9 · answered by Anonymous · 0 0

