How much electricity your computer uses depends on a variety of factors, including the exact CPU model (a "Prescott" Pentium 4 and a "Northwood" Pentium 4 can use different amounts of electricity), the number of hard drives, the amount of RAM, the number of I/O cards (like your graphics card, network card, modem, sound card, SCSI card, etc.), and what you're doing with it.
A TYPICAL computer sold today will use a constant 100-150 watts. OLDER systems (5+ years old) can actually use less (I have three systems: two that are about 6 years old and a new AMD Athlon 64 X2 system, and COMBINED they use about 300 watts).
If your computer uses 125 watts on average, then you are using 125 watts x 24 hours a day x 31 days a month = 93,000 watt-hours (or in electric company speak, 93 kilowatt-hours - 93 kWh). I live in Long Island, NY, USA and we have (if I'm not mistaken) some of the highest rates in the country - about 20 CENTS per kilowatt-hour. That means for me, those 93 kWh would cost 93 x $0.20 = $18.60 per month.
That's if you leave it on 24x7x31 (one month).
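The arithmetic above can be sketched as a tiny script - the 125-watt draw and 20-cent rate are just the figures from my example, so plug in your own:

```python
# Monthly electricity cost for a device left on continuously.
# Assumes the example figures above: 125 W draw, a 31-day month,
# and a rate of 20 cents per kWh (adjust for your utility).

def monthly_cost(watts, hours_per_day, days=31, rate_per_kwh=0.20):
    """Return (kWh used, dollar cost) for one month."""
    kwh = watts * hours_per_day * days / 1000  # watt-hours -> kWh
    return kwh, kwh * rate_per_kwh

kwh, cost = monthly_cost(125, 24)
print(f"{kwh:.1f} kWh -> ${cost:.2f}")  # 93.0 kWh -> $18.60
```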
If you make the computer sleep, the usage should go down dramatically - probably to 20-50 watts - which means if you have it "awake" for 8 hours a day and sleeping for 16 hours a day, even using the high estimate, this would cost you (in my area) about:
8 hours a day x 31 days x 125 watts = 31 kWh PLUS
16 hours a day x 31 days x 50 watts = 25 kWh (actually 24.8)
Totaling about 56 kWh at 20 cents per kWh = roughly $11.20 monthly cost.
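Here's that awake/asleep split worked through, again using my example numbers (125 W awake, 50 W asleep, $0.20/kWh):

```python
# Awake/asleep split from the example: 8 h/day at 125 W awake,
# 16 h/day at 50 W asleep, over a 31-day month.

RATE = 0.20  # dollars per kWh (my local rate - yours may differ)

awake_kwh = 125 * 8 * 31 / 1000    # 31.0 kWh
sleep_kwh = 50 * 16 * 31 / 1000    # 24.8 kWh
total_kwh = awake_kwh + sleep_kwh  # 55.8 kWh (~56)

print(f"{total_kwh:.1f} kWh -> ${total_kwh * RATE:.2f}")  # 55.8 kWh -> $11.16
```

The exact figure comes out to $11.16; rounding 55.8 kWh up to 56 gives the $11.20 quoted above.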
Now to REALLY get a handle on what it costs you, I would recommend getting a Kill-A-Watt meter for about $20 - you plug it into the outlet, plug whatever you want into it, and it tells you EXACTLY how much electricity you are using. (Note: leaving a standard lightbulb on 24x7 for a month costs almost $9 a month. If you haven't already, you might want to switch to energy-efficient LED or fluorescent bulbs - they use 70% less electricity or more, especially the LED bulbs, which can use as little as 1 watt.)
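The lightbulb aside checks out with the same arithmetic - a quick sketch assuming a standard 60-watt incandescent bulb (the wattage isn't stated above, so that's my assumption):

```python
# Cost of one incandescent bulb left on 24x7 for a 31-day month.
# Assumes a 60 W "standard" bulb and the $0.20/kWh rate used
# elsewhere in this answer.

bulb_kwh = 60 * 24 * 31 / 1000       # 44.64 kWh
print(f"${bulb_kwh * 0.20:.2f}")     # $8.93 - "almost $9 a month"
```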
http://www.smarthome.com/97314a.html
2007-01-08 12:42:09 · answer #1 · answered by lwcomputing
According to the site listed below: "If left inactive, power-managed ENERGY STAR qualified computers enter a low-power mode and use 15 watts or less. New chip technologies make power management features more robust, reliable, and easy than even just a few years ago."
2016-12-02 00:46:41 · answer #2 · answered by schiavone
I read somewhere that sleep mode does not use as much energy as a screensaver.
2007-01-08 12:35:09 · answer #3 · answered by sandiego_96963_denfish