Depends very much on your system. If we say 300 watts hypothetically, that's 2,400 watt-hours over 8 hours, or 2.4 kilowatt-hours.
Your electricity bill will give you the charge per "unit", which is a kilowatt-hour. Say it's, oh, 15p per unit: 15 × 2.4 = 36p.
If your system is all-singing, all-dancing and uses 500 watts (and that's very heavy usage), that comes out at 60p per day.
Hope that gives you an idea.
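As a minimal sketch, here is that arithmetic in Python, using the hypothetical 300 W draw and 15p-per-unit tariff from above:

```python
# Sketch of the calculation in answer #1, with the hypothetical figures
# from above: a 300 W system running 8 hours at 15p per kWh ("unit").

watts = 300          # assumed system draw in watts
hours = 8            # hours of use per day
pence_per_kwh = 15   # assumed tariff: pence per kWh ("unit" on the bill)

kwh = watts * hours / 1000          # 300 W * 8 h = 2,400 Wh = 2.4 kWh
cost_pence = kwh * pence_per_kwh    # 2.4 kWh * 15p = 36p per day

print(f"{kwh} kWh per day -> {cost_pence:.0f}p per day")
```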
2006-12-08 05:24:47 · answer #1 · answered by champer 7 · 0⤊ 0⤋
It depends on the components in your computer, and on your monitor as well. At current average electricity prices, 8 hours a day on a mid-range computer with an LCD monitor sets you back between $0.35 and $0.50 USD. A higher-end computer with a big CRT monitor could run you upwards of $0.75 to $0.85 per day.
2006-12-08 03:17:53 · answer #2 · answered by Mikkel 3 · 0⤊ 0⤋
You'll have to determine how many watts it uses (listed on the back of the tower and the monitor). Add them together; this is your wattage. Running for one hour, that wattage is roughly your watt-hours burned, so multiply it by eight for your 8-hour watt-hour usage.
Now, get your last electric bill and find out how much you are charged per watt-hour. You are probably charged by the kilowatt-hour, so divide by 1,000 to get the cost per watt-hour.
Now multiply your per-unit cost by your usage and you will have your answer.
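A short sketch of those steps, with assumed nameplate values (a label states the maximum draw, so treat the result as an upper bound):

```python
# Following answer #3's steps with assumed label values. Nameplate wattage
# is a maximum rating, so the cost this gives is an upper bound.

tower_watts = 250        # assumed: rating on the back of the tower
monitor_watts = 80       # assumed: rating on the back of the monitor
total_watts = tower_watts + monitor_watts

watt_hours = total_watts * 8              # 8-hour watt-hour usage

cents_per_kwh = 13.0                      # assumed rate from the bill
cost_per_watt_hour = cents_per_kwh / 1000 # convert to cents per watt-hour
cost_cents = watt_hours * cost_per_watt_hour
print(f"{watt_hours} Wh over 8 hours -> {cost_cents:.1f} cents")
```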
2006-12-08 03:17:07 · answer #3 · answered by CPT Jack 5 · 0⤊ 0⤋
It depends on the wattage rating of the computer's power supply (that's the part inside your computer where the cord from the wall outlet connects). A heavy-duty gaming computer with a 500 W power supply will cost more to run than a little iMac with a dinky power supply. It also depends on your monitor: a CRT (tube) monitor uses more power than an LCD (flat-panel) monitor. Finally, it depends on how much your local power company charges for electricity. I'll assume you have an average computer and an average-priced power company; it will cost about $3.00 per month.
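As a rough check of that ~$3.00-per-month figure, here is a sketch under assumed numbers (an average desktop drawing about 150 W for 5 hours a day at $0.13/kWh; all three values are assumptions, not from the answer):

```python
# Sanity check of answer #4's ~$3/month estimate under assumed figures.

watts, hours_per_day, rate = 150, 5, 0.13     # all assumed
monthly = watts / 1000 * hours_per_day * rate * 30
print(f"~${monthly:.2f} per month")           # ~ $2.93
```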
2016-11-24 23:04:24 · answer #4 · answered by loffelbein 4 · 0⤊ 0⤋
The formula to figure out how much you're spending each year on your computer goes like this: watts ÷ 1,000 = kilowatts; × hours of use per day = kWh per day; × $0.13 (the rate your electricity company charges per kWh) = cost per day; × 365 = cost per year. (See the sketch after the link below.)
Typical PC at home:
Computer running 4 hours per day with a 17″ CRT on for 4 hours = $28.47 per year.
Computer running 4 hours per day with a 17″ LCD on for 4 hours = $19.93 per year.
Computer left on around the clock:
Computer running 24 hours per day with a 17″ CRT on for 4 hours per day = $94.90 per year.
Computer running 24 hours per day with a 17″ LCD on for 4 hours per day = $86.36 per year.
In this post I’ll explain in detail how much electricity an average computer uses and how much money you’re spending to use your computer.
http://www.maximumpcguides.com/how-much-electricity-does-my-computer-use/
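The formula can be sketched as a small function. The per-component wattages below are assumptions (roughly 70 W computer, 80 W CRT, 35 W LCD) chosen so the output matches the yearly figures quoted above:

```python
# The yearly-cost formula from answer #5 written as a function. The
# per-component wattages are assumptions that reproduce the quoted costs.

RATE_PER_KWH = 0.13  # $ per kWh, as in the formula above

def yearly_cost(watts, hours_per_day, rate=RATE_PER_KWH):
    """watts / 1,000 = kW; * hours = kWh/day; * rate = $/day; * 365 = $/year."""
    return watts / 1000 * hours_per_day * rate * 365

# Assumed draws: ~70 W computer, ~80 W CRT, ~35 W LCD, each on 4 h/day.
crt_setup = yearly_cost(70, 4) + yearly_cost(80, 4)   # -> $28.47
lcd_setup = yearly_cost(70, 4) + yearly_cost(35, 4)   # -> $19.93
print(f"CRT setup: ${crt_setup:.2f}/yr, LCD setup: ${lcd_setup:.2f}/yr")
```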
2006-12-08 03:18:36 · answer #5 · answered by arcaemous 4 · 0⤊ 0⤋
We have no way of knowing this...
Hey, I have an idea: why don't you use your PC for 8 hours a month only, and see how much your bill goes up?
2006-12-08 03:16:33 · answer #6 · answered by Danlow 5 · 0⤊ 2⤋
That depends on a lot of things: what kind of monitor, what kind of computer (more specifically, processor, RAM, etc.), and what peripherals are hooked up to it.
2006-12-08 03:16:16 · answer #7 · answered by Jordan R 2 · 0⤊ 1⤋
I think that the amount of electricity used is too small to worry about.
2006-12-08 03:15:52 · answer #8 · answered by Robert B 4 · 0⤊ 1⤋
A trivial amount of money, even for a monster of a computer.
2006-12-08 03:31:01 · answer #9 · answered by Anonymous · 0⤊ 3⤋