The mA (milliamp) rating indicates the maximum electrical current the supply can deliver. The VDC rating indicates the nominal DC voltage it runs at. The two are related through the maximum power the supply can deliver, but otherwise one cannot be determined from the other. A power supply rated for 500 mA could supply that current at 5 VDC just as easily as at 12 VDC; it just depends on how it was designed.
If you know the total power, you can calculate the voltage:
Power (Watts) / Current (Amps) = Voltage (Volts)
So, if the first power supply above is rated for 2.5 watts and can supply 500 mA of current (which is 0.500 A):
2.5 W / 0.500 A = 5 V
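If it helps, here's a minimal Python sketch of that same calculation. The 2.5 W and 500 mA figures are just the example values from above, not anything specific to your supply:

```python
# Minimal sketch: derive supply voltage from rated power and rated current.
# The values below are the hypothetical 2.5 W / 500 mA example from above.

def voltage_from_power(power_w: float, current_a: float) -> float:
    """V = P / I (volts = watts / amps)."""
    return power_w / current_a

if __name__ == "__main__":
    power_w = 2.5       # rated power in watts (assumed example value)
    current_a = 0.500   # rated current in amps (500 mA)
    print(f"{voltage_from_power(power_w, current_a):.1f} V")  # prints 5.0 V
```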
You could also measure the VDC of your device with a multimeter. Voltage is easy to measure because you don't need a load connected. Current is more difficult because the supply must actually be driving a load (drawing current) while you measure it.
Good Luck
2007-10-09 06:00:59 · answer #1 · answered by TahoeT 6
Actually, you can do this in some cases. For example, industrial instruments (temperature sensors, pressure sensors, etc.) provide a 4-20 mA signal as an analog output. The current is varied up and down within the 4 to 20 mA range to represent the sensor's reading (pressure, temperature, flow, etc.). For example, if you place a 250 ohm resistor between the two leads connected to your device, you will get a 1-5 VDC signal (1 VDC = 4 mA and 5 VDC = 20 mA). This is computed using Ohm's law, V = I * R (voltage = current * resistance).
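To make that scaling concrete, here's a minimal Python sketch of the 4-20 mA loop conversion, assuming a 250 ohm sense resistor across the instrument's output leads as described above:

```python
# Sketch of the 4-20 mA current-loop scaling described above, assuming a
# 250 ohm sense resistor placed across the instrument's two output leads.

def loop_voltage(current_ma: float, resistor_ohms: float = 250.0) -> float:
    """Ohm's law: V = I * R, with the loop current converted from mA to A."""
    return (current_ma / 1000.0) * resistor_ohms

if __name__ == "__main__":
    for ma in (4.0, 12.0, 20.0):
        print(f"{ma:4.1f} mA -> {loop_voltage(ma):.2f} V")
```

Running it prints 1.00 V at 4 mA, 3.00 V at 12 mA, and 5.00 V at 20 mA, matching the 1-5 VDC range above.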
2007-10-09 13:02:04 · answer #2 · answered by ctleng76 5
You can't. mA is a reference for amperage, which is different from VDC output, which is a reference for voltage. So you will have two ratings: voltage (AC or DC) and amperage (measured in amps or milliamps; 1000 mA = 1 A). Good luck, and you can email me if this doesn't solve your problem.
2007-10-09 11:43:02 · answer #3 · answered by Anonymous