
I have a mostly stock computer; the only change is that I upgraded to a 7600 GT video card. I don't know my current power supply's wattage, but the guy at Best Buy said the video card can run at its best with 300 watts of power. Do I need to upgrade my power supply for this to be better?

Honestly, I can't see how my computer could run any better, but I'm asking whether anyone has changed their power supply and gotten better performance out of their video card.

2007-04-30 13:06:26 · 5 answers · asked by Wastedmilkman61 3 in Computers & Internet Hardware Other - Hardware

5 answers

Power supply units run cooler and last longer if the load is 80% or less.

With the roughly 36 watts of the 7600 GT you added, you could be running your power supply (or one of its rails) at or near maximum load. Stability issues may crop up later.

A bigger power supply will not improve the card's performance but could provide better reliability/stability over the longer term.
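To sanity-check the 80% guideline yourself, here's a minimal Python sketch. The 250 W system draw is just an assumed example figure, not a measurement of the asker's machine:

```python
# Rough sketch of the 80% headroom guideline from this answer.
# The 250 W draw below is an illustrative assumption, not a measurement.

def psu_load_fraction(total_draw_watts: float, psu_rating_watts: float) -> float:
    """Fraction of the PSU's rated capacity the system is drawing."""
    return total_draw_watts / psu_rating_watts

def min_psu_for_headroom(total_draw_watts: float, max_load: float = 0.8) -> float:
    """Smallest PSU rating that keeps the load at or below max_load (80% by default)."""
    return total_draw_watts / max_load

draw = 250.0  # assumed total system draw in watts
print(f"Load on a 300 W PSU: {psu_load_fraction(draw, 300):.0%}")          # ~83%, over the guideline
print(f"PSU needed for 80% headroom: {min_psu_for_headroom(draw):.0f} W")  # ~313 W
```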

2007-05-01 02:02:44 · answer #1 · answered by Karz 7 · 0 0

Power supplies are a tricky thing. It all depends on what else you have running inside:
Motherboard: 35-45 W
CPU: Intel 45-95 W, AMD 35-85 W
Memory: 7 W per 128 MB
Video card: 27-75 W
Hard drive: 45 W
DVD/CD burner: 50 W
PCI cards: 14-35 W
Add 10% more for components you add later.
You need at least 300 W just to run a low-end computer.
I have an AMD dual-core with two 300 GB hard drives and two DVD burners, only a 256 MB video card, but 3 GB of memory and no PCI cards; mine totalled around 467 W needed, but I opted for a 600 W supply to allow for future upgrades. Thanks
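Here's a minimal sketch of that sizing method in Python. The specific per-component values are assumptions picked from the ranges above, so the total won't exactly match the answerer's 467 W figure:

```python
# Rough PSU sizing using the per-component estimates from this answer.
# Each wattage is an assumed value within the quoted range; real draw varies by model.

component_watts = {
    "motherboard": 40,                 # 35-45 W
    "cpu_amd_dual_core": 85,           # AMD 35-85 W (high end)
    "memory_3gb": 7 * (3072 // 128),   # 7 W per 128 MB -> 168 W for 3 GB
    "video_card_256mb": 50,            # 27-75 W
    "hdd_300gb_1": 45,                 # per drive
    "hdd_300gb_2": 45,
    "dvd_burner_1": 50,                # per drive
    "dvd_burner_2": 50,
}

subtotal = sum(component_watts.values())
with_margin = subtotal * 1.10          # add 10% for future components

print(f"Estimated draw: {subtotal} W")
print(f"With 10% margin: {with_margin:.0f} W")
```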

2007-04-30 13:17:26 · answer #2 · answered by computer_surplus2005 5 · 0 0

One of the answerers above claimed that a computer in use is like running a 40-watt bulb. Well, if your 40-watt bulb draws about 150-200 watts, I would seriously ask for my money back. I've checked quite a few of the computers I've owned with a power-consumption meter, and they tend to use between 150 and 200 watts. Energy is measured in watt-hours, so a computer like this left on for an hour would use 200 watt-hours of electricity. Those computers had processors in the 1.5 GHz-2 GHz range. If you're using a very fast processor and you do a lot of gaming with a powerful graphics card fitted, you should probably add another 100-150 watts onto that. A CRT monitor uses a fair amount of power too; my old 15-inch one averages about 80 watts. LCD monitors tend to be less power-hungry than CRTs. Leaving computers on while they're not in use isn't just a waste of money but obviously also an unnecessary contribution to global warming. Another thing to remember is that electronic components have limited lifespans, often measured in tens or hundreds of thousands of hours. Capacitors (you'll find quite a few of these in your power supply and on your motherboard) have shorter lifespans than many other components, considerably shorter if they're run in warm or hot environments.
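To make the watt-hours point concrete, here's a quick back-of-the-envelope calculation; the electricity rate is an assumed example figure, not something from the answer:

```python
# Energy vs. power: a PC drawing 200 W for one hour uses 200 Wh (0.2 kWh).

power_draw_watts = 200.0   # typical in-use draw cited in the answer
hours_on_per_day = 24.0    # left on around the clock
rate_per_kwh = 0.12        # assumed electricity price per kWh (example only)

energy_kwh_per_day = power_draw_watts * hours_on_per_day / 1000.0
cost_per_month = energy_kwh_per_day * 30 * rate_per_kwh

print(f"Energy per day: {energy_kwh_per_day:.1f} kWh")   # 4.8 kWh
print(f"Approx. cost per month: {cost_per_month:.2f}")    # ~17.28 at the assumed rate
```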

2016-10-14 05:20:11 · answer #3 · answered by ? 4 · 0 0

If your video card is already installed, it runs fine, and you don't plan to overclock the machine, you don't need a new power supply. Increasing the power supply's wattage will not affect the card's performance.

2007-04-30 13:11:09 · answer #4 · answered by The Great One 5 · 0 0

Play a game that uses a lot of graphics power and see if it stalls.

2007-04-30 13:29:53 · answer #5 · answered by ? 5 · 0 0
