In general, higher voltage means lower current, and therefore smaller, less expensive wires, everything else equal. The equipment to run at the higher voltage is probably more expensive, though. Operating cost should not change (the other answer claiming kilowatt-hours would be lower is wrong).
This is hardly a DIY question. You don't have these voltages in your home, and if you are an electrician I would hope you already know this. Smaller commercial buildings get 208V 3-phase service. Larger buildings get 480V, which is 277V phase to neutral. Running lights at 277V is common, the HVAC in those buildings would be 480V, and a transformer would provide 208V/120V for smaller loads and receptacles. Even larger buildings get higher (primary) voltages. The power company won't even give you certain service sizes unless your demand load is high enough.
2006-10-05 02:46:20 · answer #1 · answered by An electrical engineer 5 · 1⤊ 0⤋
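The "higher voltage means lower current" point in answer #1 follows from the three-phase power formula, I = P / (√3 · V · PF). A minimal sketch in Python, using a hypothetical 50 kW load and an assumed 0.85 power factor, shows why the 480V conductors can be smaller:

```python
import math

def line_current(power_w: float, line_voltage: float, pf: float = 0.85) -> float:
    """Line current in amps for a balanced three-phase load: I = P / (sqrt(3) * V * PF)."""
    return power_w / (math.sqrt(3) * line_voltage * pf)

# Hypothetical 50 kW load served at each voltage.
load_w = 50_000
print(f"208 V: {line_current(load_w, 208):.0f} A")  # roughly 163 A
print(f"480 V: {line_current(load_w, 480):.0f} A")  # roughly 71 A
```

The same load draws less than half the current at 480V, which is what lets you run smaller (cheaper) conductors for a given ampacity.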
Both are the same in operation. But the 480V would last longer, as it works in most power supplies without fuses blowing. The 208V might give you the trouble of replacing fuses very often.
2006-10-04 18:32:18 · answer #2 · answered by Blessmegod 1 · 0⤊ 1⤋
It would cost less to run the 480V rather than the 208V, since power companies are charging you per kilowatt-hour. By that I mean you are being charged by the amount of amps you are consuming, not the amount of voltage. Ohm's law (simply put) says that using more voltage requires less current (amps). Kilowatt-hours are calculated using amps and horsepower in the same equation.
2006-10-04 19:17:55 · answer #3 · answered by dewhatulike 5 · 2⤊ 1⤋
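Note that answer #1 disputes this: billed energy is power times time, and service voltage never enters that formula, so a load drawing the same real power costs the same to run either way. A minimal sketch, using a hypothetical 10 kW load running 200 hours:

```python
def billed_energy_kwh(power_kw: float, hours: float) -> float:
    # Billing formula: kWh = kW * h. Voltage does not appear,
    # so the same real load is billed identically at 208 V or 480 V.
    return power_kw * hours

on_208v_service = billed_energy_kwh(10, 200)  # 10 kW load on 208 V service
on_480v_service = billed_energy_kwh(10, 200)  # same load on 480 V service
print(on_208v_service, on_480v_service)  # 2000 kWh either way
```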
As far as motors are concerned, the higher the voltage applied, the less amperage required to operate.
2006-10-05 00:39:02 · answer #4 · answered by RLA 2 · 1⤊ 0⤋
This sure sounds like a trick question. I don't know of anywhere you have a choice like this.
2006-10-08 14:42:08 · answer #5 · answered by Anonymous · 0⤊ 0⤋