2007-09-13 20:08:42 · 2 answers · asked by impactkom 1 in Science & Mathematics Engineering

2 answers

a) Connect a diode in series with the DC voltmeter.
b) Recalibrate the scale by multiplying the DC reading by 1.4 (see the sketch below).
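The 1.4 in (b) is roughly the sqrt(2) peak-to-RMS ratio of a sine wave; which factor actually applies depends on whether the meter responds to the peak or to the average of the rectified waveform, so the numbers below are ideal-sine reference values only. A minimal Python sketch (not from the original answer) that computes them:

```python
# Ideal-sine scale factors for a diode-plus-DC-meter hookup, assuming an
# average-responding moving-coil meter. Real meters still need calibration.
import numpy as np

t = np.linspace(0, 1, 100_000, endpoint=False)   # one cycle of a 1 Hz test sine
v_peak = 1.0
v = v_peak * np.sin(2 * np.pi * t)

v_rms = np.sqrt(np.mean(v**2))              # true RMS = v_peak / sqrt(2) ~ 0.707
v_half_avg = np.mean(np.clip(v, 0, None))   # half-wave (single diode) average ~ v_peak / pi
v_full_avg = np.mean(np.abs(v))             # full-wave (bridge) average ~ 2 * v_peak / pi

print(f"peak-to-RMS factor      : {v_peak / v_rms:.3f}")      # ~1.414, the '1.4' above
print(f"RMS / half-wave average : {v_rms / v_half_avg:.3f}")  # ~2.221
print(f"RMS / full-wave average : {v_rms / v_full_avg:.3f}")  # ~1.111
```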

2007-09-16 15:56:08 · answer #1 · answered by mariskalen kampf Strudl v.Wurst! 7 · 0 1

Open the circuit in which the current measurement is to be made and insert a full-wave ('bridge') rectifier with its AC terminals connected to the open ends of the circuit being measured. Then connect a low-value resistor (typically 1 ohm or less) between the + and - terminals of the bridge. The AC current now flowing through the circuit will always flow from - to + through the resistor and (in accordance with Ohm's law) generate a voltage V = I*R across the resistor. You can now measure this voltage with a DC voltmeter, since it's always of the same polarity.
In a perfect world, that would work fine. In the real world, you'd need to calibrate the hookup by putting known amounts of AC current through it and noting the actual DC voltage(s) generated.
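A minimal sketch of the conversion described above. The function name, shunt value, and calibration factor are illustrative assumptions, not part of the original answer; in practice the calibration factor comes from the known-current procedure just described.

```python
def ac_current_rms(dc_voltmeter_reading, shunt_ohms=1.0, cal_factor=1.11):
    """Estimate the RMS AC current from the DC voltage measured across the shunt.

    dc_voltmeter_reading : average DC voltage read across the shunt resistor (volts)
    shunt_ohms           : low-value resistor connected across the bridge's +/- terminals
    cal_factor           : ratio of true RMS current to rectified-average current,
                           found by passing known AC currents through the hookup
                           (about 1.11 for an ideal sine and ideal bridge; real diodes
                           drop ~0.6-0.7 V each, so calibrate rather than trust this)
    """
    avg_current = dc_voltmeter_reading / shunt_ohms   # Ohm's law: I = V / R
    return cal_factor * avg_current

# Example: 0.45 V measured across a 1-ohm shunt suggests roughly 0.5 A RMS.
print(ac_current_rms(0.45))
```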

HTH

Doug

2007-09-13 20:31:54 · answer #2 · answered by doug_donaghue 7 · 1 1
