
What is the average cost of electricity for a desktop for one month?

Using a 400 watt power supply, running 24 hours.

2007-01-14 04:32:07 · 5 answers · asked by NONAME 1 in Computers & Internet Hardware Desktops

5 answers

Just because the power supply is rated at 400 W doesn't mean the computer is actually drawing the full 400 W. You need to find out how much power your desktop really uses; a typical desktop draws about 65 W. Check the first site listed for an estimate of the electricity cost to run a desktop, and the second site for information on how much power a computer actually uses.

2007-01-14 04:40:43 · answer #1 · answered by Bored with Questions 2 · 0 0

It's estimated that an idle computer draws about 100 watts (73 kWh per month) and costs roughly $6.

An active computer draws about 150 watts (110 kWh per month) and costs roughly $9.

You can measure your computer's actual power usage with a power consumption meter.

Standby power can range between 10 and 15 watts, and occasionally more. On its own that's not much, but half a dozen devices on standby add up to the equivalent of a 60 watt bulb.

Use this website to calculate the exact power usage and estimate the cost per month:

http://michaelbluejay.com/electricity/computers.html
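The arithmetic behind these estimates is simple: watts ÷ 1000 × hours = kWh, then kWh × rate = cost. A minimal sketch of the calculation (the 100 W idle and 150 W active figures come from the answer above; the $0.08/kWh rate is an assumption chosen to roughly match its dollar estimates):

```python
def monthly_cost(watts, rate_per_kwh, hours=730):
    """Electricity cost for a device running `hours` per month (default ~24/7)."""
    kwh = watts / 1000 * hours  # convert watt-hours to kilowatt-hours
    return kwh * rate_per_kwh

# Wattages from the answer above; $0.08/kWh is an assumed rate.
idle = monthly_cost(100, 0.08)    # ~73 kWh/month -> about $5.84
active = monthly_cost(150, 0.08)  # ~110 kWh/month -> about $8.76
```

Plug in your own meter reading and your utility's rate to get a number for your machine.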

2007-01-14 04:59:34 · answer #2 · answered by Bonita Applebaum 5 · 0 0

Boy, what a good question; it depends on several factors. How many watts does the computer actually use? I don't remember the formulas needed to figure this out, but I will look them up. While attending electronics school we did these calculations on common household items such as TVs, toasters, etc. I do recall that a color TV cost about 25 cents per day to run on a 24-hour basis. With that in mind, I wouldn't expect a computer to cost much more, and most likely less. With today's switching power supplies and flat-screen monitors, they should be cheaper to run than units produced, say, 10 or 15 years ago. The monitor, especially if it's a CRT, will cost the most to run!

2016-11-23 17:53:52 · answer #3 · answered by ? 4 · 0 0

I don't know the exact cost, but I do know that computers use different amounts of power depending on how much RAM you have, the size/speed of your processor, and quite a few other variables. The fact that you have a 400 W power supply doesn't mean your PC is using all 400 watts.

2007-01-14 04:40:11 · answer #4 · answered by Nemo the geek 7 · 0 0

Which power supply you use shouldn't matter, since the computer isn't drawing the full 400 watts the whole time, and hardly ever will be.

Running 24/7 every day of the month, here in southern California at least, it'll cost about $40 in electricity; that's with the monitor off the whole time.

It depends on how much your utility company charges you per kWh.
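For what it's worth, the $40 figure is consistent with taking the question's 400 W at face value and a rate around $0.14/kWh (the rate is an assumption; actual southern California rates vary):

```python
watts = 400                 # PSU rating from the question, taken as worst-case draw
hours = 24 * 30             # running 24/7 over a 30-day month
kwh = watts / 1000 * hours  # 288 kWh
cost = kwh * 0.14           # assumed ~$0.14/kWh -> about $40
```

As the other answers note, real draw is usually well below the PSU rating, so this is an upper bound rather than a typical bill.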

2007-01-14 04:35:59 · answer #5 · answered by Anonymous · 0 0
