
3 answers

20 VA / 24 V = 0.833 A

In reality you never want to run a transformer at its maximum rated current. It will get hot.
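The calculation in this answer can be sketched as a one-liner; the variable names are mine, and the values come from the question (a 20 VA transformer with a 24 V output):

```python
# Maximum current from an apparent-power rating: I = S / V.
apparent_power_va = 20.0   # transformer rating in volt-amperes
voltage_v = 24.0           # secondary (output) voltage

max_current_a = apparent_power_va / voltage_v
print(f"{max_current_a:.3f} A")  # → 0.833 A
```

Per the caution above, a continuous load should stay comfortably below this figure.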

2007-02-22 13:06:32 · answer #1 · answered by John S 6 · 0 0

It depends on whether the 20 VA (volt-amperes) rating refers to the input or the output. It also depends on whether the AC source is three-phase, which affects the power-factor calculation.

Assuming this is a small residential plug-in device and the rating is the output, you can use P = I × V, which means the power in watts equals the volts times the amps. The difference between VA and watts is just the power factor.
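A minimal sketch of the VA-versus-watts distinction this answer describes. The 0.9 power factor is an illustrative assumption, not a value from the question:

```python
# Real power (watts) vs apparent power (volt-amperes): P = S * pf.
apparent_power_va = 20.0
power_factor = 0.9   # assumed for illustration; 1.0 for a purely resistive load

real_power_w = apparent_power_va * power_factor
# Note: the current drawn still follows from the VA rating, not the watts.
current_a = apparent_power_va / 24.0
print(f"{real_power_w:.1f} W at {current_a:.3f} A")  # → 18.0 W at 0.833 A
```

For a purely resistive load the power factor is 1 and VA and watts coincide, which is why the distinction often goes unnoticed with small plug-in devices.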

2007-02-22 15:33:03 · answer #2 · answered by Michael M 1 · 0 0

Depends on the resistance of the circuit.

2007-02-22 13:57:21 · answer #3 · answered by Bill J 1 · 0 0
