
I was wondering what the total wattage is for a desktop computer.

2007-09-02 14:59:36 · 9 answers · asked by buddyboy 1 in Computers & Internet Hardware Desktops

9 answers

Dude, they're all different... You've got to add up the power consumption of each component (motherboard, CPU, graphics card, sound card, floppy and CD drives, etc.), check the power supply, and see what you come up with.

2007-09-02 15:03:57 · answer #1 · answered by Jason 6 · 0 0

Well, that depends... A fairly modern computer might have a 200 W to 450 W power supply. The more watts, the more components can draw power inside the computer (really fast CPU, great graphics card), and the more power used. My old computer has a 100 W power supply; another has a 145 W one. And a really high-performance workstation or gaming rig might have a 1,000 W power supply. Just Google the question and you may get some great results!

2007-09-02 15:11:36 · answer #2 · answered by deth 4 · 0 0

Different computers use different amounts of power.

Typical PCs use 30-70 watts at idle and can use hundreds of watts under heavy load.

LCD monitors use less power than glass CRTs.
Many of the newer CPUs run at a third of the wattage of the previous-generation CPUs.


2007-09-02 15:06:43 · answer #3 · answered by Anonymous · 0 0

Check your monitor and your power supply in the computer.

A modern power supply might be rated at 750 watts, but that's a maximum, not what it draws all the time. A standard cathode ray tube monitor uses about 60 watts; an LCD monitor probably only uses 40 watts. In practice your average computer draws around 0.079 to 0.081 kilowatts (79 to 81 watts). My power company is charging me (robbing me) 14.82 cents per kilowatt-hour, so I am paying about 28 cents per day to run my computer (assuming I run it 24 hours a day).

All the USB devices, the cards and other things inside of the computer run off the power supply and the power supply is rated at its MAXIMUM output. If you don't have so many bells and whistles you won't have much power consumption. If you use a printer or scanner then you will have to add that power to your calculations.

Power companies charge in kilowatt-hours (kWh), and a kilowatt = 1,000 watts.
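The daily-cost arithmetic above can be sketched as a few lines of Python (the ~80 W average draw, the 14.82 ¢/kWh rate, and 24-hour runtime are the figures from this answer, not universal values):

```python
# Estimate daily electricity cost for a desktop computer.
draw_kw = 0.080             # average draw in kilowatts (80 watts, assumed)
rate_cents_per_kwh = 14.82  # utility rate in cents per kilowatt-hour
hours_per_day = 24          # machine left running all day

energy_kwh = draw_kw * hours_per_day          # energy used per day, in kWh
cost_cents = energy_kwh * rate_cents_per_kwh  # daily cost, in cents

print(f"{energy_kwh:.2f} kWh/day -> {cost_cents:.1f} cents/day")
# prints "1.92 kWh/day -> 28.5 cents/day"
```

Plugging in your own draw and rate gives your own daily cost; the structure of the calculation (kW × hours × rate) stays the same.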

2007-09-02 15:11:31 · answer #4 · answered by Dan S 7 · 0 0

Answer #1 is right, it depends on what you have in your computer, but actually using 1,000 watts is extremely hard to do with a single computer. I have a gaming computer with high-end parts and a 550-watt power supply, and I still have about 100-200 watts of headroom; my computer uses about 350-450 watts depending on how demanding the program is.

2007-09-02 15:09:01 · answer #5 · answered by applebeer 5 · 0 0

I believe Mr. Joe, but he forgot to mention one more step: electricity charges are measured in units, where 1 unit = 1 kWhr (or 1,000 Whr). Further, you need to know the rate for 1 unit; only then can you work out exactly what you'll pay (you can get the details from your local electricity board).

2016-11-14 01:07:10 · answer #6 · answered by ? 4 · 0 0

Depends on your system stats, your PSU, etc.

Can be from less than 100 watts to almost 1000.

2007-09-02 15:02:59 · answer #7 · answered by Anonymous · 1 0

I have an HP that uses 260 watts. Watts / volts = amps, so at about 110 volts that works out to roughly 2.36 amps.
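That watts-to-amps conversion is a one-liner; here it is as a small sketch (the 260 W figure is from this answer, and 110 V North American line voltage is an assumption):

```python
# Current drawn = power / voltage (for a resistive load at steady state).
watts = 260.0  # the HP's power draw, from the answer above
volts = 110.0  # assumed line voltage

amps = watts / volts
print(f"{amps:.2f} A")  # prints "2.36 A"
```

On 230 V mains the same 260 W machine would draw about half the current.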

2007-09-02 15:06:31 · answer #8 · answered by Anonymous · 0 0

It usually depends on what type of computer you have.

2007-09-03 01:53:54 · answer #9 · answered by jess @ work 1 · 0 0
