
Hi, on my power supply, when I set it to put out less voltage than the load needs, the current increases, and vice versa. But according to Ohm's law, if the voltage decreases, (V1 - V2)/R gets smaller, so there should be less current. I'm confused; can someone explain?

2007-02-20 11:08:37 · 2 answers · asked by a a 1 in Science & Mathematics Engineering

2 answers

Your power supply is compensating for its load, and it is doing this automatically. You apparently have a fixed load, and the supply was designed to deliver a certain power to that load to satisfy its internal regulators. If the load needs, let's say, 12 volts at 1 amp but you set the output to 10 volts, the internal regulators will increase the current delivered to 1.2 amps to supply the 12 watts the load needs. The same is true if the current limit is set to 0.75 amps: the output voltage rises to 16 volts to meet the wattage demanded by the load resistance.

For the formula you gave to behave as you expect, you need a fixed-output power supply, one that provides, again let's say, 12 volts at 1 amp and that is all you'll get out of it. Then your formula will work. Using an automatic, compensating power supply is just going to make the formula look wrong and drive you nuts trying to figure it out. If you can, turn off the automatic tracking function of the power supply.
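A minimal sketch of that compensation, assuming an ideal load regulated to a constant 12 W as in the example above (the names and numbers are just illustrative):

# Constant-power load held at 12 W (12 V at 1 A nominal), matching the example above.
LOAD_POWER_W = 12.0  # power the regulator keeps delivering to the load

def current_for_voltage(volts):
    """Current needed to hold 12 W when the output voltage is set to `volts`."""
    return LOAD_POWER_W / volts

def voltage_for_current(amps):
    """Voltage needed to hold 12 W when the current limit is set to `amps`."""
    return LOAD_POWER_W / amps

print(current_for_voltage(10.0))   # 1.2 A: set the voltage lower, current goes up
print(voltage_for_current(0.75))   # 16.0 V: limit the current, voltage goes up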

2007-02-20 16:20:01 · answer #1 · answered by Anonymous · 0 0

Sounds as if you're driving some sort of 'active' load (such as another power supply). If that's the case, the total power required by the load is a constant, and the product of the voltage and current required will also be a constant. So yes, as the voltage decreases, the current increases to keep the power constant.
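As a rough illustration of the difference (the 12 V / 1 A operating point is just an assumed example): a plain resistor follows Ohm's law, while a constant-power load does the opposite.

R_OHM = 12.0    # fixed resistor sized for 1 A at 12 V
P_WATT = 12.0   # constant-power load drawing 12 W

for v in (12.0, 10.0):
    i_resistor = v / R_OHM        # Ohm's law: current falls as voltage falls
    i_const_power = P_WATT / v    # constant power: current rises as voltage falls
    print(f"{v:4.1f} V: resistor {i_resistor:.2f} A, constant-power load {i_const_power:.2f} A")
# 12.0 V: resistor 1.00 A, constant-power load 1.00 A
# 10.0 V: resistor 0.83 A, constant-power load 1.20 A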


Doug

2007-02-20 19:14:00 · answer #2 · answered by doug_donaghue 7 · 1 0
