
I would be creating more voltage, less current, and less resistance. That would create more heat. How can I test the linear power supply safely without everything being so ****** hot?

2007-11-16 07:40:24 · 7 answers · asked by Infinite Resistance Ω 2 in Science & Mathematics Engineering

7 answers

A linear power supply typically refers to a power supply with a transformer, bridge, filter capacitor, and a regulator. The transformer is typically chosen for its output voltage and current rating. The output voltage of the transformer is selected to provide the optimum voltage for the regulator operation.

The transformer voltage selection matters because, as the input voltage to a fixed regulator increases, the output voltage and current remain the same (for a fixed load) and the additional power is dissipated as heat by the regulator. However, if the differential between the regulator's input and output voltages is too high, the efficiency of the regulator goes down, generating more heat and reducing the output current.

Therefore a 12V regulator designed to operate at 1A will deliver 12W. If you operate the regulator at half the power, you have kept the voltage the same and halved the current (by doubling the load resistance). The reduction in current reduces the heat dissipated by both the load and the regulator.

At this point, increasing the input voltage to the regulator forces the regulator to dissipate more heat while the output power stays the same, up to the point where the regulator degrades (the input/output differential is too high) and the output power drops due to a reduction in current.
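
A rough sketch of that arithmetic in Python (the 12 V / 1 A figures come from the paragraph above; the 16 V and 20 V input voltages are assumed values, not from the question):

    # Fixed 12 V regulator: heat in the regulator is (Vin - Vout) * Iout.
    # The 16 V and 20 V inputs are assumed for illustration.
    v_out = 12.0     # fixed regulator output
    i_full = 1.0     # full-load current
    i_half = 0.5     # half the current (load resistance doubled)

    v_in = 16.0
    p_load_full = v_out * i_full              # 12 W delivered to the load
    p_reg_full = (v_in - v_out) * i_full      # 4 W burned in the regulator
    p_reg_half = (v_in - v_out) * i_half      # 2 W: less current, less regulator heat

    # Raising only the input voltage leaves the output power alone,
    # but every extra input volt becomes regulator heat:
    v_in_high = 20.0
    p_reg_high = (v_in_high - v_out) * i_full  # 8 W in the regulator, same 12 W out

    print(p_load_full, p_reg_full, p_reg_half, p_reg_high)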

Now assume you are using a variable regulator (LM317 for instance). If you double the output voltage for a fixed load, you quadruple the output power. (5V for 50 ohm load draws 100mA and produces 0.5W, 10V for 50 ohm load draws 200mA and produces 2W). The only advantage is that your regulator input/output differential has decreased but the current through the device has doubled.
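
The fixed-load numbers above work out like this in Python (the 15 V input is an assumed value, included only to show what happens to the input/output differential):

    # Adjustable regulator (LM317-style) feeding a fixed 50-ohm load.
    # The 15 V input is assumed for illustration.
    v_in = 15.0
    r_load = 50.0

    for v_out in (5.0, 10.0):
        i = v_out / r_load              # 0.1 A, then 0.2 A
        p_load = v_out * i              # 0.5 W, then 2.0 W (quadrupled)
        p_reg = (v_in - v_out) * i      # regulator heat: smaller differential, more current
        print(v_out, i, p_load, p_reg)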

In neither case (raising the input voltage to the regulator or raising the regulator's output voltage) is there a reduction in current or resistance within the normal operating range. Only when the input/output differential is too high is there a chance of a reduction in output current, but the input current will typically increase for a bit (increasing the heat dissipated) until the regulator starts to shut down.

2007-11-16 16:18:05 · answer #1 · answered by Anonymous · 0 0

It would be relatively unusual for a linear power supply to dissipate more power with a reduced load, though if it were a shunt regulator (like a typical zener regulator) that would be the case.
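
A minimal sketch of the shunt-regulator case (all component values below are assumed for illustration): the series resistor current is roughly fixed, so whatever the load doesn't take, the zener has to burn off.

    # Zener shunt regulator: series resistor from the raw supply, zener across the load.
    # All values are assumed for illustration.
    v_raw = 15.0      # raw supply voltage
    v_z = 5.1         # zener voltage
    r_series = 100.0  # series resistor, ohms

    i_series = (v_raw - v_z) / r_series      # ~99 mA flows regardless of the load

    for i_load in (0.08, 0.02, 0.0):         # heavy load, light load, no load
        i_zener = i_series - i_load          # leftover current goes through the zener
        p_zener = v_z * i_zener              # zener heat rises as the load shrinks
        print(i_load, round(i_zener, 3), round(p_zener, 3))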

I'm not sure I understand: "I would be creating more voltage, less current, and less resistance."
If you have more voltage and less resistance, you will usually have more current, but perhaps you are talking about different parts of the circuit.

What are you testing for?
"How can I test the linear power supply safely without everything being so ****** hot."
It sounds to me like the supply failed the test if it is so f'ing hot at half load.

2007-11-16 12:49:58 · answer #2 · answered by tinkertailorcandlestickmaker 7 · 0 0

It depends on what it's hooked up to. Power supplies come in certain wattages so the supply can regulate a reasonable amount of power at once without surges; larger supplies deliver more power at once but don't "work" as hard, so they tend to produce less heat. I had a computer power supply burn up because I had too many drives and things running and the supply wasn't big enough.
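
A back-of-the-envelope check along those lines (every wattage below is an assumed example, not a measurement):

    # Rough check that a supply's rating covers the connected load, with headroom.
    # All numbers are assumed examples.
    supply_rating_w = 300.0
    loads_w = {"motherboard": 80.0, "drive_1": 25.0, "drive_2": 25.0, "fans": 10.0}

    total_w = sum(loads_w.values())
    print(total_w, "OK" if total_w <= 0.8 * supply_rating_w else "too close to the limit")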

2007-11-16 07:43:46 · answer #3 · answered by Jason P 2 · 0 0

Since you are concerned with heat, I will assume that your power supply includes a voltage regulator. If so, it probably runs cooler with less current draw. If you connect a voltmeter to the input and output of the regulator and record readings under different loads, you can calculate which condition generates the least heat (volts x amps).
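
In other words, something like this with the recorded readings (the numbers below are placeholders for your own meter readings):

    # Regulator heat from recorded readings: (input volts - output volts) * load amps.
    # Substitute your own measurements for the placeholder readings below.
    readings = [
        # (v_in, v_out, i_load)
        (16.0, 12.0, 1.0),
        (16.0, 12.0, 0.5),
    ]

    for v_in, v_out, i_load in readings:
        p_regulator = (v_in - v_out) * i_load   # heat in the regulator
        p_load = v_out * i_load                 # power delivered to the load
        print(p_regulator, p_load)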

2007-11-16 13:55:29 · answer #4 · answered by Tim C 7 · 0 0

1- No, only half the heat, no matter how you work it. Period.

2- Use fans - seriously. Unless you have access to an active load which will put the power back into the grid (we make them here where I work, up to 900kW - yes, "kilowatts").

That's why I mentioned earlier, in your other question, engineering/running your load resistors such that they dissipate half their rated capacity or less. Resistors, which are designed to dissipate power, can run awfully hot without failing. Selecting resistors such that they dissipate less than their rated power doesn't mean you'll dissipate less power in total; it just means the heat will be more spread out and easier to handle (see the sizing sketch below). Break out the fans, it's normal.
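
A sizing sketch for that approach (the 12 V / 1 A load and the 5 W resistor rating are assumed figures): split the total dissipation across enough resistors that each one runs at half its rating or less.

    # Size a resistive dummy load so each resistor runs at <= half its rated power.
    # The 12 V / 1 A target and 5 W resistor rating are assumed examples.
    import math

    v_supply = 12.0
    i_target = 1.0
    p_total = v_supply * i_target            # 12 W has to go somewhere

    resistor_rating_w = 5.0                  # rating of each resistor on hand
    derating = 0.5                           # run each part at half its rating or less
    n = math.ceil(p_total / (resistor_rating_w * derating))   # 5 resistors

    # n equal resistors in parallel: each must be n times the target resistance.
    r_target = v_supply / i_target           # 12 ohms total
    r_each = r_target * n                    # 60 ohms each, about 2.4 W apiece
    print(n, r_each, p_total / n)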

2007-11-16 10:21:06 · answer #5 · answered by Gary H 6 · 0 0

Actually, I'm not entirely sure how hybrids get their heat. If they do have a heater while running entirely off electric power, then they must have an electric heater similar to a hair dryer or a space heater. Otherwise they use the waste heat from the internal combustion engine, since electric motors generate almost no heat.

2016-10-24 08:40:09 · answer #6 · answered by ? 4 · 0 0

Basically, power that is dissipated results in heat. So if your power supply is going to dissipate less power, then you should have less heat.

2007-11-16 08:53:19 · answer #7 · answered by nisaiz3000 2 · 0 0
