2007-01-21 10:02:44 · 4 answers · asked by lee m 1 in Computers & Internet > Hardware > Other - Hardware

4 answers

Watts = voltage times current.

So, if you have a power output of 700 watts connected to a 110-volt line, the current drawn is about 6.4 amps.

This depends on the voltage of your line. You can express this as P = EI, or I = P/E; in this instance, 700/110 ≈ 6.36 A.

If your line voltage is 120 V, then you are drawing about 5.83 amps.
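A quick sanity check of that arithmetic in Python (a sketch; it assumes the supply really draws its full 700 W rating, which a real supply rarely does):

# I = P / E: amps drawn by a 700 W load at a given line voltage.
def amps(watts, volts):
    return watts / volts

print(f"{amps(700, 110):.2f} A at 110 V")  # 6.36 A
print(f"{amps(700, 120):.2f} A at 120 V")  # 5.83 A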

2007-01-21 10:10:13 · answer #1 · answered by Kokopelli 7 · 0 1

Well, this depends. People often associate the wattage of a computer's power supply with the 120-volt line-in voltage, which is not correct.

Each power supply has a transformer which can either raise or lower the line-in voltage, depending on how it is wired.

A transformer usually consists of a primary coil and one or more secondary coils of wire wound on a shared core, electrically isolated from each other.

For a computer power supply, the line-in voltage is normally attached to the primary windings. The transformer is also wired so that the 120 volts (or 230 volts) is dropped down to +12 V and +5 V. Additional taps on the transformer provide -5 V and -12 V.

In a transformer, if the voltage is stepped down from the primary to the secondary winding, the end result is more available current on the secondary winding: the power (watts) drawn on the primary winding must equal the power (watts) delivered by the combined secondary windings (ignoring losses).
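Here is a minimal sketch of that power balance, assuming an idealized, lossless transformer (the 10:1 turns ratio and the 6 A limit below are made-up example numbers):

# Idealized (lossless) transformer: power in = power out, so stepping
# the voltage down steps the available current up by the same factor.
def secondary_limits(v_primary, i_primary_max, turns_ratio):
    # turns_ratio = N_secondary / N_primary (< 1 steps the voltage down)
    v_secondary = v_primary * turns_ratio
    i_secondary_max = i_primary_max / turns_ratio
    return v_secondary, i_secondary_max

v_s, i_s = secondary_limits(120.0, 6.0, 1 / 10)
print(f"{v_s:.0f} V at up to {i_s:.0f} A")                # 12 V at up to 60 A
print(f"{120.0 * 6.0:.0f} W in = {v_s * i_s:.0f} W out")  # 720 W on both sides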

In addition, a toroidal transformer is often used to help increase the amount of current available.

In addition, the output of a computer power supply is regulated by a pulse-width-modulation circuit. Most computer power supplies are switching power supplies and turn on and off about 200,000 or more times a second.
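A tiny sketch of the pulse-width idea, using an idealized buck converter rather than the more complex topology inside a real ATX supply (the 200 kHz figure comes from this answer; the input voltage is illustrative):

# Idealized buck converter: the switch chops the input at a fixed
# frequency, and the filtered output averages to v_in times the
# fraction of each cycle the switch spends on (the duty cycle).
def buck_duty_cycle(v_in, v_out):
    return v_out / v_in

f_switch = 200_000           # switching events per second, per the answer
period_us = 1e6 / f_switch   # 5 microseconds per cycle

d = buck_duty_cycle(v_in=170.0, v_out=12.0)  # ~170 V rectified mains (illustrative)
print(f"duty cycle {d:.1%}: on for {d * period_us:.2f} of every {period_us:.0f} us")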

Now, the other thing people are confused about is current (I). Power supplies do not "put out" current. They provide a voltage, which is an electrical potential (similar to water pressure). When the circuit is completed, current flows. The amount of current flowing in a circuit is dependent upon the source voltage and the resistance in the circuit.
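In code form (Ohm's law; the load resistances here are arbitrary examples):

# Ohm's law: the supply fixes the voltage; the load's resistance
# determines how much current actually flows (I = V / R).
def current_drawn(v_rail, r_load):
    return v_rail / r_load

print(current_drawn(12.0, 2.0))   # 6.0 A: a heavy load on the +12 V rail
print(current_drawn(12.0, 24.0))  # 0.5 A: a light load draws far less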

A power supply will state the maximum amount of current it can provide on each output, based on the voltage, resistance, and other properties of the circuit.

So, based on all of this information, if you want to evaluate this theoretically, you take all of the DC output voltages in a power supply (+12, +5, -12, -5), assume the same current on each, and raise that current until the total is roughly 700 watts. I came up with about 20 amps.

+12 V x 20 A = 240 W
+5 V x 20 A = 100 W
| -12 V x 20 A | = 240 W
| -5 V x 20 A | = 100 W

The total comes to about 680 watts, roughly the 700 W rating. This is the estimate on the secondary side of the power supply. For the primary side, you would have to measure the input current; there is too much circuitry in between to simply calculate it.
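The same tally in a few lines (the even 20 A per rail is this answer's simplifying assumption, not how real supplies are rated):

# Per-rail power: watts = |volts| * amps, summed over all DC outputs.
rails = [("+12V", 12.0, 20.0), ("+5V", 5.0, 20.0),
         ("-12V", -12.0, 20.0), ("-5V", -5.0, 20.0)]

total = 0.0
for name, volts, amps in rails:
    watts = abs(volts) * amps
    total += watts
    print(f"{name}: {watts:.0f} W")

print(f"total: {total:.0f} W")  # 680 W, roughly the 700 W rating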

The answer to your question depends on how the power supply is designed. Most will not provide an even distribution of power across the outputs.

2007-01-21 20:13:39 · answer #2 · answered by Shawn H 6 · 0 0

Watts divided by volts = amps. That is the basic power formula (P = V x I), a companion to Ohm's law.

700 / 120 = 5.833333

I guess you are in the US. If you are in another country, use the voltage of the outlet you connect your device to. Example: most of Europe and Asia use 230 to 250 volt systems.
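The same formula at those higher mains voltages (again assuming a full 700 W draw):

# I = P / V at 230-250 V mains.
for volts in (230, 250):
    print(f"{700 / volts:.2f} A at {volts} V")
# 3.04 A at 230 V, 2.80 A at 250 V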

2007-01-21 18:14:22 · answer #3 · answered by Anonymous · 0 0

about 4 amps

2007-01-21 18:06:20 · answer #4 · answered by Croatian Sensation 1 · 0 0
