
How much electricity do you think your average PC power supply uses in a month? How much would it raise your electric bill?

I have had 2 computers given to me and I already have 2 computers. I want to attach the new ones to the network, but if it raises my electric bill much, I want to pass on them.

2007-05-19 08:46:03 · 8 answers · asked by Mercury 2010 7 in Environment Green Living

8 answers

Today's computer power supplies usually don't actually shut down when the machine is switched off. Newer computers go into a hibernate/standby mode instead, and in that state they can still draw as much as 60 watts of electricity.
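For a rough sense of what that standby draw could cost, here is a small back-of-the-envelope sketch in Python (the 60 W figure is from the claim above; the $0.12/kWh rate is an assumed example, not from the original post):

    # Hypothetical cost of 60 W of constant standby draw, assuming $0.12/kWh.
    standby_watts = 60
    hours_per_month = 24 * 30
    kwh = standby_watts * hours_per_month / 1000          # 43.2 kWh of energy
    print(f"{kwh:.1f} kWh/month, about ${kwh * 0.12:.2f}")  # ~$5.18 per machine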

When I set up my computers, I usually configure them so that they do not hibernate but actually shut down. This can be done in the BIOS setup section of your computer, usually accessible by pressing either the Delete or Escape key during boot on many computers.

2007-05-19 09:03:58 · answer #1 · answered by Darqblade 3 · 0 0

One of the answerers above claimed a PC in use is like having a 40-watt bulb on. Well, if your 40-watt bulb draws about 150-200 watts, I'd seriously ask for my money back.

I've checked a number of the PCs I've owned with a power-consumption meter, and they tend to draw between 150 and 200 watts. Power is measured in watts and energy in watt-hours, so a PC drawing 200 watts left on for an hour uses 200 watt-hours of electricity. These PCs have had processors in the 1.5-2 GHz range. If you're using a really fast processor and you do a lot of gaming with a fancy graphics card fitted, you could probably add another 100-150 watts to that. A CRT monitor uses a fair amount of power too - my old 15-inch one averages about 80 watts. LCD monitors tend to be less power-hungry than CRTs.
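As a rough sanity check on those numbers, here is a short Python sketch of the same arithmetic (the 200 W tower and 80 W CRT figures are from the measurements above; the $0.12/kWh rate and the usage hours are illustrative assumptions):

    # Convert a steady power draw into monthly energy (kWh) and cost.
    def monthly_cost(watts, hours_per_day, rate_per_kwh=0.12):
        kwh = watts * hours_per_day * 30 / 1000   # energy = power x time
        return kwh, kwh * rate_per_kwh

    # 200 W tower + 80 W CRT, left on around the clock:
    kwh, dollars = monthly_cost(200 + 80, 24)
    print(f"{kwh:.0f} kWh/month, about ${dollars:.2f}")   # ~202 kWh, ~$24

    # Same machine used 4 hours a day and shut down otherwise:
    kwh, dollars = monthly_cost(200 + 80, 4)
    print(f"{kwh:.0f} kWh/month, about ${dollars:.2f}")   # ~34 kWh, ~$4

For the asker's situation, multiply whichever scenario applies by the number of machines actually left running.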

Leaving PCs on when they're not in use is not only a waste of money but obviously also an unnecessary contribution to global warming.

Another point to bear in mind is that electronic components have limited lifespans, usually measured in tens or hundreds of thousands of hours. Capacitors (you'll find plenty of these in your power supply and on your motherboard) have shorter lifespans than many other components, drastically shorter if they're run in warm or hot environments.

2007-05-20 07:27:09 · answer #2 · answered by lineartechnics 3 · 0 0

I know people who are in the computer business and they never turn their computers off. They say that in standby mode it uses less electricity than a night light. When in use it is like a 40-watt bulb. The monitor eats the most electricity.

2007-05-19 09:07:19 · answer #3 · answered by my_alias_id 6 · 0 0

Get a computer with Energy Star. It allows you to set when the computer should "turn off" (go into a low-power state) without actually shutting down. A whole lot of electricity is saved this way.

2007-05-20 09:22:55 · answer #4 · answered by polakio92 2 · 0 0

Do you actually use all of them? If so, don't worry about the cost. Otherwise: simplify, simplify, simplify.

Conservation begins with each of us.

"I wanted to change the world. But I have found that the only thing one can be sure of changing is oneself." ~ Aldous Leonard Huxley : English writer & critic (1894 - 1963)

2007-05-19 09:11:11 · answer #5 · answered by Beach Saint 7 · 0 0

A computer only draws the power it needs at any given moment, so using a less powerful one will not necessarily save power.

2016-05-17 14:41:11 · answer #6 · answered by ? 3 · 0 0

Probably about a dollar for every 5 hours each computer is on... it adds up by the end of the month.

2007-05-19 08:53:21 · answer #7 · answered by Keyan 3 · 0 0

idk

2007-05-19 09:16:15 · answer #8 · answered by janette_gb 2 · 0 0
