
3 answers

Yes, there are tons of factors that come into play here.

Even if you have a 400 W power supply, your computer might draw only 50 W when it's idling.

But computers do not use a whole lot. In my household there are 13 computers and monitors running 24/7/365, some under heavy load and drawing a lot of power all the time, and my electric bill was never over $200, even in summer when it's humid and 100 degrees outside and the AC is set to 70. Usually it's around $150. This is with every computer having at least a 19" LCD, up to a 30" LCD.

In fact, the computer I am using now has a 600 W power supply and four 22" Samsung monitors in a quad-monitor array.

There are some gadgets you can purchase that monitor your computer's power consumption and will tell you the cost if you input what you pay per kilowatt-hour, but to the general user it's not that important.
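The arithmetic those meters do is simple: convert watts to kilowatt-hours over a billing period, then multiply by your rate. A minimal sketch (the 120 W draw and 12¢/kWh rate below are illustrative assumptions, not figures from the answers above):

```python
def monthly_cost(watts, cents_per_kwh, hours_per_day=24, days=30):
    """Estimate the monthly electricity cost of a device in dollars.

    watts: average power draw measured at the wall, not the PSU's rating.
    cents_per_kwh: your utility's rate in cents per kilowatt-hour.
    """
    kwh = watts / 1000 * hours_per_day * days  # energy used over the month
    return kwh * cents_per_kwh / 100           # convert cents to dollars

# Example: a PC idling at 120 W, running around the clock, at 12 cents/kWh.
print(round(monthly_cost(120, 12), 2))  # → 10.37
```

The key point the answers make is that the wall-draw number, not the power supply's rated maximum, is what belongs in this calculation.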

2007-11-29 08:47:18 · answer #1 · answered by Anonymous · 0 0

Most computers with power supplies below 500 W will barely make a dent in your power bill. My 450 W machine costs about $24 a month to run constantly. This also depends on the price per kilowatt-hour in your area.
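As a back-of-envelope check of that $24 figure (treating the 450 W rating as continuous draw, which is a worst case; a PC rarely draws its PSU's rated maximum):

```python
# Worst-case monthly energy for a machine drawing a constant 450 W,
# and the electricity rate that would make it cost $24/month.
watts = 450
hours = 24 * 30                 # one 30-day month of continuous operation
kwh = watts / 1000 * hours      # 324 kWh
implied_rate = 24 / kwh         # dollars per kWh
print(round(implied_rate, 3))   # → 0.074, i.e. about 7.4 cents/kWh
```

So the claim is plausible in an area with cheap power; at a higher rate, or with actual draw well under 450 W, the figure shifts accordingly.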

2007-11-29 16:28:25 · answer #2 · answered by Anonymous · 0 0

There are too many variables involved to be accurate, e.g.:
local hydro rates, the hardware you run, and whether the disks or display are always on.

2007-11-29 16:12:38 · answer #3 · answered by derdaca 2 · 0 0
