Simple mathematics:
Light bulb: 60-100 watts of electricity.
PC: 350-450 watts of electricity. Add to that the monitor, speakers, lights in the room, and maybe a printer switched on and sitting idle, which is going to be another 150-200 watts or so. So your dad is right: your PC can possibly use 5 or 6 times more energy than a light bulb. This might not sound like a lot at around 6 pence per hour, but add up all those hours in a day, then a month, then a year.
5 hours per day at 6 pence per hour = 30 pence a day; x 7 = £2.10 a week; x 4 = £8.40 a month; x 12 = £100.80 a year, which is nearly 50% of my yearly electricity costs. On a daily basis this may not sound like much, but total it up for a year and it's a lot, especially if you're not the one who has to pay the electricity bill (and that's assuming you live in mainland GB, where electricity costs are somewhat cheaper than in Northern Ireland; there it would likely be £150.00 per year). And that is just for your PC. Think of all the other things in the house being used: TV, DVD player, kettle, fridge/freezer, washing machine, tumble dryer. They all cost very little to run on an hourly basis, but add it all up over a year and it can run into thousands. Maybe your dad should consider energy-saving bulbs, as they cost less to run, and you should spend less time on your PC. Think of the benefits to the environment; everybody wins in the long term.
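To make that arithmetic explicit, here is a minimal sketch; the 600 W combined draw and 10p/kWh unit rate are illustrative assumptions chosen to reproduce the 6p-per-hour figure, not measured values.

```python
# Sketch of the sum above. The 600 W combined draw and 10p/kWh unit
# rate are illustrative assumptions, not measured figures.
watts_total = 600          # PC + monitor + speakers + idle printer (assumed)
pence_per_kwh = 10         # assumed unit rate; check your own tariff
hours_per_day = 5

pence_per_hour = watts_total / 1000 * pence_per_kwh    # 6p per hour
pence_per_day = pence_per_hour * hours_per_day         # 30p per day
pounds_per_year = pence_per_day * 7 * 4 * 12 / 100     # ~£100.80 (48 "weeks")

print(f"{pence_per_hour:.0f}p/hour, {pence_per_day:.0f}p/day, £{pounds_per_year:.2f}/year")
```

Note the yearly total uses 4 weeks per month (48 weeks); a full 52-week year would come out a little higher.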
2006-10-25 22:58:02 · answer #1 · answered by species8472 6 · 0⤊ 0⤋
I'm guessing the p is for pound... and last time I checked a pound was about $1.72 in US dollars.
NO WAY does it take that much money, at least in the USA, to light a light bulb for one hour. You might pay that for an entire month for one light bulb, maybe, if it was left on the whole time.
A PC here in the States might use about 8 cents of power per day. There are 100 cents in a US dollar, so you can do the conversion.
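For a rough sense of where a figure like 8 cents a day could come from, here is a hedged sketch; the 400 W draw, 10 cents/kWh rate, and 2 hours of daily use are assumptions for illustration, not figures from the answer.

```python
# Rough US version of the sum. The 400 W draw, 10 cents/kWh rate, and
# 2 hours/day of use are illustrative assumptions only.
watts = 400
cents_per_kwh = 10
hours_per_day = 2

cents_per_day = watts / 1000 * cents_per_kwh * hours_per_day   # 8 cents/day
print(f"about {cents_per_day:.0f} cents per day")
```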
2006-10-25 22:06:23 · answer #2 · answered by Anonymous · 0⤊ 0⤋
For our American cousins, p is an abbreviation of penny (it works the same way as dollars and cents).
The best way to test is to get a plug meter from Argos that measures how many kilowatt-hours are used by a single plug (they are in the news all the time with the drive to curb energy inefficiency). Take the reading, then work out the price from the rates set by your electricity company.
As a fan motor is being powered, I'm guessing it costs more than you would think.
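As a rough sketch of that plug-meter method, assuming an example meter reading and a 10p/kWh unit rate (check your own bill for the real figure):

```python
# Turn a plug-in energy meter reading into a running cost.
# The 0.45 kWh reading and 10p/kWh unit rate are example values only.
kwh_used = 0.45        # kWh recorded by the plug meter
hours_metered = 1.0    # how long the meter was left running
pence_per_kwh = 10     # your supplier's unit rate (see your bill)

pence_per_hour = kwh_used / hours_metered * pence_per_kwh
print(f"roughly {pence_per_hour:.1f}p per hour for whatever is on this plug")
```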
2006-10-25 22:14:41 · answer #3 · answered by enigma_variation 4 · 0⤊ 0⤋
A computer consumes roughly 70% more energy when the monitor is turned on.
A PC on its own consumes less energy than you might think, but it all depends on what you have your system running at.
Overclocking your processor uses more power, dedicated graphics cards use more power, and USB ports consume a fair bit too.
It depends on your system, but for an average system I think it's about 10p an hour, energy-wise.
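To make the "depends on your system" point concrete, here is a rough component-by-component sketch; every wattage and the 10p/kWh rate are illustrative assumptions. With these particular guesses the total lands nearer 3p an hour; the 10p figure above would correspond to roughly 1 kW of draw at the same rate.

```python
# Rough per-component estimate of running cost. All wattages and the
# 10p/kWh unit rate are illustrative assumptions; real systems vary widely.
components_watts = {
    "processor": 90,                 # higher if overclocked
    "dedicated_graphics_card": 75,
    "motherboard_and_usb": 40,
    "drives_and_fans": 25,
    "monitor": 60,
}
pence_per_kwh = 10

total_watts = sum(components_watts.values())            # 290 W with these guesses
pence_per_hour = total_watts / 1000 * pence_per_kwh     # ~2.9p per hour
print(f"~{total_watts} W total, about {pence_per_hour:.1f}p per hour")
```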
2006-10-25 22:06:39 · answer #4 · answered by Anonymous · 0⤊ 0⤋
Annual energy cost running 24/7, minus the saving from switching off at night, gives the actual annual cost to run the computer (all figures in US dollars):
Computer.......... $39 - $9  = $30
Monitor (15")..... $54 - $12 = $42
Laser printer..... $44 - $14 = $30
--------------------------------------
Total............. $102.00 annually
365 days x 24 hours per day = 8,760 hours
$102.00 / 8,760 hours ≈ $0.0116 per hour
In British pounds that's about 0.0059 GBP per hour; since there are 100 pence in a pound, that's a little over half a penny (roughly 0.6p) per hour.
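Here is a minimal sketch of that conversion; the annual dollar figures are the estimates above, and the 1.90 USD-per-GBP exchange rate is an assumed circa-2006 value used only for illustration.

```python
# Sketch of the per-hour sum above. The annual dollar figures are the
# poster's estimates; the 1.90 USD/GBP exchange rate is an assumption.
annual_costs_usd = {"computer": 30, "monitor_15in": 42, "laser_printer": 30}
hours_per_year = 365 * 24                    # 8,760 hours

total_usd = sum(annual_costs_usd.values())   # $102
usd_per_hour = total_usd / hours_per_year    # ~$0.0116
gbp_per_hour = usd_per_hour / 1.90           # assumed exchange rate
print(f"${usd_per_hour:.4f}/hour ~= {gbp_per_hour * 100:.2f}p/hour")
```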
2006-10-25 22:38:09 · answer #5 · answered by midnightlydy 6 · 2⤊ 0⤋
Tell him it is about the same as 2-4 light bulbs; that is the easiest way I can explain it.
2006-10-26 11:50:14 · answer #6 · answered by mysticman44 7 · 0⤊ 0⤋
Good question, but it depends on your energy provider's prices :D
2006-10-25 22:13:34 · answer #7 · answered by liquiD 1 · 0⤊ 0⤋
Less than a penny.
2006-10-25 22:04:14 · answer #8 · answered by Anonymous · 0⤊ 0⤋
Not that much; it's just like a telly.
2006-10-25 22:05:47 · answer #9 · answered by Anonymous · 0⤊ 0⤋