
2006-08-03 03:42:30 · 8 answers · asked by Aggronaut 1 in Computers & Internet Hardware Other - Hardware

8 answers

Around 200 watts, but it depends on your monitor.

2006-08-03 03:47:24 · answer #1 · answered by Anonymous · 3 0

Electricity is usually measured in "units", a unit being a kWh, that is, a kilowatt-hour. So an appliance that draws a kilowatt for 1 hour uses 1 unit of electricity. Most PCs have a 200 W power supply, some modern ones 300 W, so if your PC is running at maximum power it will use 1 unit every 5 hours. But it's likely to use half that or maybe even less. The monitor can use more, especially an older CRT, and less if it's a flat-screen LCD. The best approach is to read the labels on your system box, monitor, and anything else plugged in and add up the watts. It will probably come to around 500 watts in total, and so use 1 unit every 2 hours. But most PCs turn off the monitor and go into sleep mode or hibernate to save energy, especially laptops, which run on batteries and of course have a low-energy flat screen. I'm using a laptop and helping to save the environment!
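To make that "units" arithmetic concrete, here is a minimal Python sketch; the 200 W and 500 W figures are simply the examples from the answer above, not measurements:

# Rough sketch of the "units" (kWh) arithmetic described above.
def units_used(watts, hours):
    """Return electricity use in kWh ('units') for a constant draw."""
    return watts / 1000.0 * hours

# A 200 W PC running flat out uses 1 unit every 5 hours:
print(units_used(200, 5))   # 1.0
# A 500 W PC-plus-monitor setup uses 1 unit every 2 hours:
print(units_used(500, 2))   # 1.0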

2006-08-03 03:55:43 · answer #2 · answered by Mike10613 6 · 0 0

I don't know whether you use a Baby AT, ATX, or Micro ATX system; it depends on that. However, professionals recommend the latest Micro ATX with a 400-watt supply. Besides that, your monitor, printer, and cable modem all together probably take a maximum of about 700 watts. But when it comes to the power bill it will be almost nothing for you, because it depends on the amps: when everything runs through a step-down transformer or AC-to-DC converter, the voltage doesn't cost you much, or we could say practically nothing. As an example, run your PC somewhere other than your house and check the power bill, then use your PC at home and check the bill again; you will find hardly any difference, maybe $1 or $2 more. So no worries about power bills in the future... bye.

2006-08-03 04:10:13 · answer #3 · answered by shabnavees r 1 · 0 0

As for how much your machine uses over a period of, say, 1 hour, I am not sure,

but if you're thinking of saving power and you use your machine heavily, I would recommend not switching it off at all.


For instance, this machine is in operation (in use) for about 20 hours every day, with about 4 hours when it is not being used.

Because of the large amount of power it takes to boot the thing up, it is never worth switching it off, except for the odd reboot when applications require it. Currently my machine has not been rebooted for 7 days; it is switched on 24/7.

The longest period it has been in use is 1 month with no reboots.

If you have worked in an office, you may have noticed that most companies do not switch their machines off at the end of the working day; this is because of the power required to start them up, and it is cheaper to simply leave them running.


It's similar to a car: a car uses a lot of fuel when starting up, so instead of switching it off for a 10-minute idle period and then restarting it, just leave it on for the duration.

2006-08-03 04:02:02 · answer #4 · answered by paul_heilbron 3 · 0 0

I have a Kill-A-Watt meter and measured this on mine. It's an eMachines 2.60 with a 17-inch monitor (not a flat screen):

computer    monitor    watts
off         off            6
standby     standby       10
on          off           52
on          standby       61
on          on           112
on + AOL    on           122
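As a rough illustration (not part of the original measurements), here is a Python sketch that turns one of those readings into a monthly cost; the 8 hours of use a day and the $0.10/kWh rate are assumptions:

# Convert a measured wattage into monthly kWh and cost.
RATE_PER_KWH = 0.10          # assumed electricity price, dollars per kWh

def monthly_cost(watts, hours_per_day, days=30):
    """Return (kWh per month, dollars per month) for a constant draw."""
    kwh = watts / 1000.0 * hours_per_day * days
    return kwh, kwh * RATE_PER_KWH

# e.g. the "on, on" reading of 112 W for 8 hours a day:
kwh, cost = monthly_cost(112, 8)
print(f"{kwh:.1f} kWh, about ${cost:.2f} per month")   # 26.9 kWh, about $2.69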

2006-08-03 04:17:47 · answer #5 · answered by enord 5 · 0 0

On average around 350 watts, depending on your PSU spec.

2006-08-03 03:53:40 · answer #6 · answered by J's 1 · 0 0

Depends on your power supply and how much your system demands.

2006-08-03 03:48:34 · answer #7 · answered by HotRod 5 · 0 0

a lot

2006-08-03 03:46:24 · answer #8 · answered by Anonymous · 0 0
