Hi Brian, I can most definitely say that if your PSU doesn't quite cover the power needed by the graphics card, it will affect performance. For instance, the latest games are more visually intense and so draw more from the graphics card. It may run fine to a certain degree, but without the proper power supply it is going to run hot and will, at the very least, reduce the life of the graphics card, and at worst cause it to stop working altogether. If you replace the PSU with the 550 W unit, I can confidently say it will improve the graphics card's performance. Having the 550 W PSU will also enable you to add upgrades to the motherboard at a later date, maybe a new DVD drive or hard disk, whenever needed, without the worry of having enough power. If it were me, I would go with the 550 W PSU.
Hope this helps
Regards
Ron
Added 04/10/07: I don't know who is giving all these answers the thumbs down, but if they would like to post their own answer, I would like to read it.
2007-10-02 10:53:43 · answer #1 · answered by ronald8826 3 · 0⤊ 1⤋
Despite what that last answer said, yes, in fact you should concern yourself with the current. As he said, wattage equals amps times voltage. Those numbers list the voltage rails and the number of amps the PSU will supply on each rail. By far the most important one to look at is the +12V rail, because that is what both the CPU and video card run on, and those are the power-hungry parts. Many video cards will tell you not only how many total watts your power supply needs, but also how many amps you need on the +12V rail. If your power supply as a whole has enough watts but your +12V rail doesn't have enough amps, you will still run into problems.
As for underpowering the graphics card, in my experience this usually ends up being an on/off situation: either you have enough power and it works, or you don't have enough, and it doesn't.
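The two checks this answer describes, total wattage and +12V amperage, can be sketched roughly like this (every number here is invented for illustration, not taken from any real PSU or card):

```python
# Hypothetical sketch of the two checks above; all figures are made up.

def psu_meets_card(psu_total_watts, psu_12v_amps,
                   card_min_watts, card_min_12v_amps):
    """A PSU must satisfy BOTH the total wattage and the +12V amperage."""
    return (psu_total_watts >= card_min_watts
            and psu_12v_amps >= card_min_12v_amps)

# Enough watts overall but too few amps on the +12V rail still fails:
print(psu_meets_card(550, 18, 450, 22))  # prints False
print(psu_meets_card(550, 30, 450, 22))  # prints True
```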
2007-10-02 12:32:02 · answer #2 · answered by mysticman44 7 · 1⤊ 1⤋
Remember: power (wattage) = voltage x current (amperage)
In this case, voltage and current really do not matter, and all you should concern yourself with is power.
Every component consumes a certain amount of power, and for the system to run properly, the maximum amount of power supplied by the PSU must be greater than the amount of power consumed by the entire computer.
Given that, you cannot say a single component, such as a graphics card, requires an X-watt power supply, because it really depends on the entire system. I know some manufacturers do this, but they are making worst-case assumptions to ensure their product will work in a wide variety of systems.
Now, if the power supply is not able to supply enough power, voltage will begin to drop on whichever line(s) have exceeded the maximum amount of power they can supply. This will cause all sorts of problems, such as data corruption, system instability, etc.
It has become increasingly popular to use ever more powerful PSUs. In some cases, where you are powering a bunch of lights, fans, disks, etc., it is necessary, but I think in most cases PSU ratings come down more to a matter of 'mine is bigger than yours.' One problem that drives me nuts is that most peripheral manufacturers don't list how much power their device consumes, making it very difficult to calculate exactly how much power you need.
I have a fairly recent machine with an NVidia graphics card, 6 case fans, 2 hard disks, and 2 CD-ROM drives running on a 350-watt PSU without any problems. I was admittedly concerned about this configuration when I first built it, and I probably wouldn't recommend to someone else that a 350-watt PSU is enough for this setup.
So while I doubt you are running into power-shortage problems, if you are legitimately experiencing system instability, the PSU may very well be the cause. If you want to upgrade to a 550-watt PSU, go for it; it won't hurt anything, and it will supply more power than any desktop PC should ever need.
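The budgeting idea in this answer, summing each component's draw and comparing the total to the PSU rating, can be sketched like this (all the per-part wattages are rough guesses of mine, not measured figures):

```python
# Rough, illustrative power budget; every per-part wattage is a guess.
ESTIMATED_DRAW_W = {
    "cpu": 95,
    "gpu": 110,
    "motherboard": 40,
    "hard_disk": 10,      # per drive
    "optical_drive": 15,  # per drive
    "case_fan": 3,        # per fan
}

def total_draw_watts(counts):
    """counts maps a part name to how many of that part are installed."""
    return sum(ESTIMATED_DRAW_W[part] * qty for part, qty in counts.items())

# A system like the one described: 6 fans, 2 hard disks, 2 CD-ROM drives.
system = {"cpu": 1, "gpu": 1, "motherboard": 1,
          "hard_disk": 2, "optical_drive": 2, "case_fan": 6}
draw = total_draw_watts(system)
print(draw, 350 - draw)  # prints 313 37 -> a 350 W PSU leaves ~37 W spare
```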
2007-10-02 11:23:57 · answer #3 · answered by limaxray 3 · 1⤊ 2⤋
That depends. The overall wattage is the raw power, so if you have a 350-watt PSU and your system needs 400 watts, you will probably get random system restarts without warning. This can damage the power supply and the components in the PC.
Different components may require a certain minimum amperage to function. This is usually the motherboard itself, the graphics card, and the CPU. The amperage requirements for these components fall on the twelve-volt rails, and if you have insufficient amperage they probably won't function at all.
A graphics card is likely to be the least of the problems, as it draws on the total amperage of the combined two or three twelve-volt rails, depending on your PSU.
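The point about combined twelve-volt rails can be put in numbers (the rail ratings below are invented for a hypothetical PSU, not taken from a real one):

```python
# Hypothetical PSU with three +12V rails; the amp ratings are invented.
rails_12v_amps = [18, 18, 16]

combined_amps = sum(rails_12v_amps)   # amperage available on +12V overall
combined_watts = combined_amps * 12   # P = V x I
print(combined_amps, combined_watts)  # prints 52 624
```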
2007-10-02 11:02:04 · answer #4 · answered by randomnomenclature 3 · 0⤊ 1⤋
Those are good answers. There is no problem with an excess current rating for voltage-regulated supplies; however, there are also constant-current supplies, where 4.5 amps would likely destroy a 3.5-amp device. Those are used for LED lights, for welding, and a few other applications. There are also unregulated supplies that might deliver excessive current and voltage to a device. For example, some wall warts are designed to supply 6 volts at one amp; if the load is designed for 0.1 amp at 6 volts, it may get as much as 9 volts at around 0.14 amps, which could be destructive. Fortunately, most supplies are voltage regulated. Neil
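Neil's wall-wart example can be checked with Ohm's law, treating the load as a fixed resistance (which real devices only approximately are):

```python
# Ohm's-law check of the wall-wart example above; the load is modelled
# as a fixed resistance, which is an approximation.
load_ohms = 6 / 0.1                 # a 0.1 A load at 6 V looks like 60 ohm
current_at_9v = 9 / load_ohms       # ~0.15 A, close to the ~0.14 A quoted
power_at_9v = 9 * current_at_9v     # ~1.35 W vs the 0.6 W design point
print(round(current_at_9v, 2), round(power_at_9v, 2))  # prints 0.15 1.35
```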
2016-04-07 00:55:24 · answer #5 · answered by Anonymous · 0⤊ 0⤋
New power supplies have higher wattage/amperage on the +12V rail or rails, because current processors and graphics cards draw mainly from the +12V rail. A power supply rated higher than required runs well below full load, so it runs cooler and tends to last longer. I would expect the 550-watter to easily power mainstream graphics cards all day long.
2007-10-02 11:04:42 · answer #6 · answered by Karz 7 · 0⤊ 1⤋
The wattage of the PSU depends greatly on how much hardware you have in the PC. For instance, a basic PC will run on a 300-watt PSU. USB is a big drain on wattage, as it pulls power from the motherboard, as do the keyboard and mouse.
Adding more boards, such as sound cards, graphics cards, and memory, also drains power. So if you have all of these, you will need a PSU that can supply all of them; adding all of these to the basic PC means the 300-watt PSU needs to be increased to at least 500 watts for stability and reliability.
2007-10-02 10:57:13 · answer #7 · answered by Ti-2000 3 · 0⤊ 1⤋
If your power supply is only 350 watts, you are in effect overrunning the power supply by 50 watts, and it will run a bit warmer than it should. I would go for the bigger-wattage type; it will not hurt to go bigger, but don't go for a smaller-wattage type, as long as the voltage output is the same as you require.
2007-10-02 10:49:13 · answer #8 · answered by Anonymous · 0⤊ 3⤋
I never figured out the math, but what you are looking at is the different voltages and amperages as laid out by the manual.
2007-10-02 10:55:27 · answer #9 · answered by Joe 4 · 0⤊ 2⤋
The reason it is laid out like that is to show you what amperage each connector has, in order of motherboard / hard drive / graphics card / etc.
2007-10-02 10:41:29 · answer #10 · answered by Anonymous · 0⤊ 2⤋