
7 answers

Well, I can't say I agree 100% with any answer above. Just when one seems to make sense, it goes off on a wild tangent that's way off!

Yes, you can use a wattmeter to get a more exact figure. But realize that most modern PCs (built within the last 2-3 years) consume between 125 and 200 watts when idle, and anywhere from 175 to 350 watts when busy. Some can max out even higher, but the average PC will struggle to draw more than 300 watts at any given time.
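If you want to turn those wattage ranges into a daily energy figure, the arithmetic is just watts times hours divided by 1000. A minimal sketch, where the specific idle/busy watt values and hours are assumptions picked from the ranges above, not measurements:

```python
# Rough daily energy estimate for a desktop PC.
# These numbers are illustrative assumptions taken from the ranges above.
IDLE_WATTS = 150   # idle draw, within the 125-200 W range
BUSY_WATTS = 250   # busy draw, within the 175-350 W range
IDLE_HOURS = 20    # hours per day mostly idle
BUSY_HOURS = 4     # hours per day under load

# Energy in kilowatt-hours: sum of (watts * hours), divided by 1000
kwh_per_day = (IDLE_WATTS * IDLE_HOURS + BUSY_WATTS * BUSY_HOURS) / 1000
print(kwh_per_day)  # 4.0 kWh per day
```

Multiply that by your utility's per-kWh rate to see what leaving the machine on actually costs.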

So as you can see, how you use your computer matters as well as the type of hardware you have installed. You didn't say why you wanted to know, so you received quite a few different responses.

I will say that I disagree with the answer above about power-save modes causing wear and tear on a hard drive. That's completely false. A hard drive spins frequently while a PC is left powered on, even when the OS is idle, because background processes use it constantly. If I use my PC for only 1-2 hours each day, it makes sense to put the PC on a standby setting (say, after 2 hours of being idle). There will be a lot less wear and tear on the drive as a result.

2007-02-04 19:34:04 · answer #1 · answered by SirCharles 6 · 1 0

Hi,
In terms of wattage, a typical PC consumes 360 to 400 watts with a CRT monitor. But if you have gadgets attached to it, the wattage may increase; a printer adds another 30-40 watts. In any case, a fully loaded computer will consume no more than about 500 watts at peak.

2007-02-05 03:09:42 · answer #2 · answered by Mohan 2 · 0 0

Between 150 and 500 watts for a desktop and monitor, and about 60 to 120 watts for a laptop. To know for sure, you will need to get an ammeter, measure the current draw, and multiply that by your supply voltage to get the watts used. You can really reduce that consumption by setting your power management utility to turn off the monitor and hard drives. I never turn my computers off, which has the added benefit of limiting wear and tear on the hard drive and extending its life.
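The measurement described above boils down to P = V × I. A minimal sketch of that calculation, where the 120 V supply voltage and the 2.5 A meter reading are assumed example values:

```python
# Power from an ammeter reading, as the answer describes:
# watts = supply voltage (volts) x measured current (amps)
def power_watts(volts, amps):
    return volts * amps

# Example: 120 V mains, ammeter reads 2.5 A
print(power_watts(120, 2.5))  # 300.0 watts
```

Note this gives apparent power for AC loads; a wattmeter that accounts for power factor will read somewhat lower on a typical PC supply.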

2007-02-05 03:12:36 · answer #3 · answered by maintman73 2 · 0 0

It depends on what kind of processor and video card you have.
For example, a Pentium II 233 MHz processor, S3 Savage 4 MB video card, and 1.2 GB hard disk will take around 180 watts maximum.
But a C2D 2.66, 8800 GTX SLI, and RAID 0 Raptor 150 GB can take around 800 watts.
Here's a link to calculate how much electricity is consumed by your PC.

http://www.journeysystems.com/power_supply_calculator.php

2007-02-05 03:18:08 · answer #4 · answered by Anonymous · 1 1

I really don't know, but I do know this: I turn off the switch on my surge protector every time I turn the computer off, and my electric bill has gone down 14 dollars a month, three months running now.

2007-02-05 03:08:31 · answer #5 · answered by Anonymous · 0 1

Turn off every electrical appliance in your house except for your computer. Then, go outside and look at your meter.

2007-02-05 03:13:11 · answer #6 · answered by Anonymous · 0 1

It takes about what a 60-watt bulb takes.

2007-02-05 03:17:29 · answer #7 · answered by amazing 2 · 0 1
