It would be really easy to calculate the cost of leaving one light on for a period of time.
Let's say you have a 60-watt bulb and you need to use it for 4 hours a day, but you are too lazy to turn it off and leave it on for 12 hours every day.
That's 8 wasted hours × 0.06 kW × 30 days × $0.08 = $1.15 per month per 60-watt light bulb, assuming 8 cents per kWh is your rate.
So, each one would cost an extra $1.15 a month to burn 8 hours needlessly.
The hard part to quantify would be how many bulbs you are leaving on and how much of that time counts as wasteful. I would think that someone not doing a good job of turning lights off could leave approx. 5-6 bulbs on per day at 8 hours of waste each.
That could then equal $6 - $7 per month. So, I wouldn't automatically accept that your habit is inconsequential compared to his.
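The arithmetic above can be sketched in a few lines of Python (the $0.08/kWh rate and 8 wasted hours are the assumptions from this answer; substitute your own figures):

```python
# Monthly cost of leaving a light on needlessly (figures from the text above).
WATTS = 60            # bulb wattage
WASTED_HOURS = 8      # hours per day the bulb is on unnecessarily
DAYS = 30             # days per month
RATE = 0.08           # dollars per kWh (assumed rate from the text)

kwh_wasted = WATTS / 1000 * WASTED_HOURS * DAYS   # 14.4 kWh per month
cost = kwh_wasted * RATE                          # about $1.15 per month

print(f"{kwh_wasted:.1f} kWh wasted -> ${cost:.2f} per month per bulb")

# Scale up for a careless household: 5-6 bulbs at 8 wasted hours each.
print(f"5 bulbs: ${5 * cost:.2f}/month, 6 bulbs: ${6 * cost:.2f}/month")
```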
Now, if he leaves his window open all day, the average effect would probably be at least 1 additional hour per day of run time on your heating system, probably a lot more on cold days.
So, assume 3 kW of heat energy × 1 hr × 30 days × $0.08 = $7.20 per month.
There are many more variables that would affect the cost of leaving the window open, but by assuming only 1 hour of additional heating, I believe any further refinement of the calculation would probably increase the cost of the window situation even more. For example, the heater could draw 10 kW per hour of run time, which would more than triple his expense. But if he shuts the window between cigarettes, that would reduce it considerably.
So, in total, I believe that the window left open all day would usually waste more energy, but both habits would be considered wasteful.
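A quick sketch of the window estimate, using this answer's assumed 3 kW heater draw and 1 extra run hour per day (both assumptions, not measurements):

```python
# Heating cost of the open window, using the assumptions stated above.
HEAT_KW = 3.0         # assumed electric heater draw while running (kW)
EXTRA_HOURS = 1.0     # assumed extra run time per day caused by the window
DAYS = 30
RATE = 0.08           # dollars per kWh (same assumed rate as before)

window_cost = HEAT_KW * EXTRA_HOURS * DAYS * RATE   # $7.20 per month
print(f"Open window: ${window_cost:.2f} per month")

# If the heat source really draws 10 kW, the estimate more than triples:
print(f"At 10 kW: ${10.0 * EXTRA_HOURS * DAYS * RATE:.2f} per month")
```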
Your money is a resource and should be managed. For your situation, would you buy a coke and pour 16 ounces down the drain every time you poured an 8 ounce glass for yourself? Of course not. Then why would you do that with electricity?
For him, would you buy a car and then attach a 1,000 lb weight to it that reduced the gas mileage by 10-15%?
So, don't be stubborn. You should both encourage each other to use electricity wisely while still maintaining your friendship.
For an example, I once knew some roommates who were arguing because one used a curling iron each morning and would let it heat up for 15 minutes. The other person thought that was excessive and was causing their high electric bill. I did a calculation similar to the one above and told them they were arguing over less than 8 cents for the entire year they would be together. I handed one person a dime, told him to use whatever he needed to pay the excess, and said he could keep the change.
2007-02-10 07:21:56 · answer #1 · answered by bkc99xx 6
What you can do is this; the formula below shows how much energy a 60-watt bulb uses:
A 60-watt bulb draws 0.06 kilowatts, so it uses one (1) kilowatt-hour of electricity in about 16.7 hours.
You need to look at your electric bill and find out what the usage AND delivery charges are for 1 kilowatt-hour of electricity, then calculate the cost of leaving that bulb on for, say, an extra hour a day for a month.
Rate per kilowatt-hour × 0.06 = dollar cost of 1 hour of usage.
Now multiply that hourly cost by 30 days, and then by 5 for the winter months (since we're going to look at heat next).
It costs the AVERAGE homeowner $500 a year in heat loss through a fireplace - not an open window, just a fireplace. Let's be generous here and assume that your open window is sucking out $100 a month.
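As a sketch of that bulb calculation, assuming an all-in rate of $0.10 per kWh (a made-up figure; read the real usage-plus-delivery rate off your own bill):

```python
# Cost of one extra bulb-hour per day over the five winter months.
RATE = 0.10             # usage + delivery, dollars per kWh (assumption)
BULB_KW = 60 / 1000     # a 60 W bulb draws 0.06 kW

hourly_cost = BULB_KW * RATE        # dollar cost of 1 hour of usage
monthly = hourly_cost * 30          # one extra hour a day, 30 days
winter = monthly * 5                # five winter months

print(f"${hourly_cost:.4f}/hour, ${monthly:.2f}/month, ${winter:.2f}/winter")
```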
There - now you can do battle
2007-02-10 07:00:53 · answer #2 · answered by PamV 3
Get another roommate or a prescription for Valium. It's not even provable that turning lights off when leaving a room results in monetary savings, unless you aren't the one buying light bulbs and/or you leave the light off for at least 1/2 hour before turning it on again. You can get a pretty good estimate of the cost of heating the flow of air through the open window in the winter, but it's a waste of time. You won't be believed.
2007-02-10 07:04:12 · answer #3 · answered by Helmut 7
To calculate the exact value of energy savings by turning a light off, you need to first determine how much energy the light(s) consume when on. Every bulb has a Watt rating printed on it. For example, if the rating is 40 watts, and the bulb is on for one hour, it will consume 0.04 kWh, or if it is off for one hour, you will be saving 0.04 kWh.
Then you need to find out what you are paying for electricity per kWh (in general and during peak periods). You will need to look over your electricity bills and see what the utility charges per kWh. Multiply the rate per kWh by the amount of electricity saved, and this will give you the value of the savings.
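A minimal sketch of those two steps, with an assumed rate of $0.12/kWh (pull the real per-kWh charge from your own bill):

```python
# Value of turning a 40 W bulb off, per the two steps described above.
WATTS = 40             # wattage printed on the bulb
RATE = 0.12            # dollars per kWh (assumed; check your bill)

def savings(hours_off: float) -> float:
    """Dollars saved by keeping the bulb off for `hours_off` hours."""
    kwh_saved = WATTS / 1000 * hours_off   # 0.04 kWh per hour for a 40 W bulb
    return kwh_saved * RATE

print(f"1 hour off saves ${savings(1):.4f}")
print(f"4 hours/day for a month saves ${savings(4 * 30):.2f}")
```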
Heating and cooling account for about 56% of the energy use in a typical U.S. home, making it the largest energy expense for most homes.
2007-02-10 06:56:07 · answer #4 · answered by Anonymous
Time to get a new roommate. Or, if you can get the wattage of the heating system and of the light bulb, you can show him that the heater uses far more. That might work if the heater is electric, but I suspect it isn't, and it will be even harder to argue if you have gas heat.
2007-02-10 06:56:07 · answer #5 · answered by campbelp2002 7
Running the heater all day is going to use a lot more energy than forgetting to turn off a few light bulbs.
Want to prove it? Compare your electric bill to your gas bill.
2007-02-10 09:01:57 · answer #6 · answered by Roman Soldier 5
Nuclear power has been shown to be risky when corners are cut to keep costs down. Nuclear could be made safe, yet it would not be low-cost then. It probably shouldn't be abolished, merely required to be safe - that way unsafe nuclear power couldn't be used and alternative energy could take over.
2016-10-01 22:23:57 · answer #7 · answered by Anonymous
Don't you want to look around in a warm house and not see smoke?
2007-02-10 06:55:37 · answer #8 · answered by Wattsup! 3
Who cares?
2007-02-10 06:54:02 · answer #9 · answered by sammy 5