
I've heard that it actually takes more energy to boot up a computer or switch on a lightswitch, etc, than it does to leave whatever it is running for an hour. Would I actually be saving energy by leaving my computer on all the time?

2006-11-15 17:34:43 · 8 answers · asked by Celia 3 in Science & Mathematics Other - Science

8 answers

What you need to consider are two different, but related, things about electrical equipment.

The first one refers to the "life" of a product. Any kind of switch (electrical or mechanical) has a limited number of uses in its "lifetime". This means the more the mechanism operates in a given time period, the shorter the overall life will be. All lightbulbs fall into this category, as do most microchip assemblies.

A simple example of this would be flexing a paper clip ... flex it enough times and the clip will break. If you flex it once per day, it will last quite a while; flexing it as fast as you can will considerably shorten the time it takes to break.

Secondly, some items (like lightbulbs and rotating equipment) draw more current during startup than during normal operation. For an ordinary tungsten-filament lightbulb the startup is virtually instantaneous (though a measurable fraction of a second). A fluorescent bulb takes longer (a few seconds), while a mercury-vapor lamp may take several minutes to reach full brightness. In each case, the inrush current (the current drawn during startup) is only slightly higher than the operating current (about 10 percent higher). From this, we can estimate how long a device must stay off before switching it off actually saves energy (about a second for an incandescent bulb, up to a minute for a mercury-vapor lamp).

Rotating equipment takes slightly longer ... the inrush is typically higher (around 450-1000 percent of the running current) and lasts longer (several seconds up to about a minute).
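The break-even idea above can be sketched with a small calculation. The wattage, inrush factor, and inrush duration below are illustrative assumptions for the example, not measurements, and the model treats power as simply proportional to current:

```python
def break_even_off_time(running_watts, inrush_factor, inrush_seconds):
    """Seconds a device must stay off before the energy saved
    exceeds the extra energy drawn during the next startup."""
    # Extra energy consumed at startup, in joules
    extra_joules = (inrush_factor - 1.0) * running_watts * inrush_seconds
    # Time at running power that equals that extra energy
    return extra_joules / running_watts

# A 60 W incandescent bulb, ~10% extra for 0.1 s:
bulb = break_even_off_time(60, 1.1, 0.1)    # 0.01 s

# A hypothetical 500 W motor drawing ~6x running current for 10 s:
motor = break_even_off_time(500, 6.0, 10)   # 50.0 s
```

Even with a generous inrush, the break-even off-time is seconds, not hours, which is the point of the answer above: the energy argument for leaving things on is about wear, not inrush.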

When folks are telling you to keep it turned on, they're not (normally) talking about conserving energy in terms of use. What they mean is the replacement cost of the device (from wearing it out during multiple stop/start sequences) is extremely high in terms of energy. THIS would be the energy you are saving by leaving things running ... and could equate to several hours of "idle" time, rather than multiple stops and starts.

2006-11-16 03:29:07 · answer #1 · answered by CanTexan 6 · 0 0

If it sounds like nonsense it probably is. And this is clear tosh.

A very small number of devices have a power surge on start up (most notably fluorescent tubes), but it is seconds or minutes before this equals the running power. It is always more energy efficient to turn things off, but it is not always practical or cost effective in the long term.

The real issues are (1) does the device operate properly if turned off (set-top boxes, for instance, do not) and (2) does switching shorten its lifetime (some PC components' lifetimes are shortened if they are turned off and on a lot - so it's a compromise and depends how long you will not be using the PC).

2006-11-15 21:23:30 · answer #2 · answered by Anonymous · 0 0

With the exception of fluorescent lighting fixtures, it will almost always save energy to shut things off (not just put them on standby). An argument can be made for putting computers into hibernation or standby, where it takes less time to resume work than to reload everything. Some equipment takes time to warm up to operating temperature and therefore costs more if you keep shutting it down and restarting it. An example is an oil refinery, where it takes three days to shut down safely and another three days to return to normal operation; it is better and cheaper to keep it running at a minimum rate the whole time. No doubt people can give many more examples of this kind of thing, but the real answer is: shutting things down will usually save power.

2016-11-24 22:04:09 · answer #3 · answered by Anonymous · 0 0

It sounds like a myth to me. It is only industrial motors which take a surge of power to get started, and it would be more energy efficient to leave such a motor running, instead of continually stopping and starting it. A light bulb takes a few milliseconds to heat the filament up to the temperature of incandescence. There is a brief surge of current during that time, but the amount of additional energy consumed is negligible.

2006-11-15 17:40:59 · answer #4 · answered by Anonymous · 0 0

No, that doesn't make any sense. Say your computer uses 150 Watts and it takes 2 minutes (a definite exaggeration) to power up. In order for it to use an hour's worth of energy in that 2 minutes, it would have to draw 4500 watts. In North America, that's 37.5 amps (at 120 V), which would blow a breaker for sure.
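The arithmetic above can be checked directly. The 150 W draw and 2-minute boot are the figures assumed in the answer itself, not measurements:

```python
watts = 150        # assumed steady draw of the computer
boot_minutes = 2   # deliberately exaggerated boot time
line_volts = 120   # North American mains voltage

# Energy for one hour of normal running, in watt-hours
hour_of_energy_wh = watts * 1.0

# Power the machine would need to draw to burn that much
# energy during the 2-minute boot
required_watts = hour_of_energy_wh / (boot_minutes / 60)   # ~4500 W

# Current at mains voltage -- far beyond a typical 15 A breaker
amps = required_watts / line_volts                         # ~37.5 A
```

Since no desktop computer pulls anywhere near 4.5 kW at boot, the "startup costs an hour of energy" claim cannot be right.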

2006-11-15 18:44:34 · answer #5 · answered by injanier 7 · 0 0

I've heard this too, but I was told it's more like 5 minutes: you use five minutes' worth of energy when you flip the light switch on.

2006-11-15 17:40:22 · answer #6 · answered by GlooBoy 3 · 0 0

Because of inrush current, some things do and some things don't.

2006-11-15 18:23:31 · answer #7 · answered by tim s 3 · 0 0

No.

2006-11-15 17:42:18 · answer #8 · answered by FrogDog 4 · 0 0
