Use RivaTuner; you can monitor your temperature and see how high it goes, without overclocking, while your graphics card is running 3D games or a benchmark. As long as it doesn't go more than 20-30 degrees Celsius above normal, you should be all right.
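If you'd rather script that sanity check than eyeball it, the rule of thumb (stay within roughly 20-30 C of your normal load temperature) is easy to express. This is just an illustrative sketch with readings typed in by hand; RivaTuner itself doesn't expose a Python API, and the numbers below are hypothetical.

```python
def within_safe_headroom(baseline_c, current_c, max_delta_c=25):
    """Return True if the current GPU temperature is within the
    rule-of-thumb headroom above your normal (baseline) reading.

    max_delta_c defaults to 25, the midpoint of the 20-30 C range
    mentioned above -- an assumption, not a hard spec."""
    return (current_c - baseline_c) <= max_delta_c

# Hypothetical readings taken from a monitoring tool:
print(within_safe_headroom(55, 72))  # 17 C over baseline -> True
print(within_safe_headroom(55, 90))  # 35 C over baseline -> False
```

You'd read the actual numbers off RivaTuner's hardware-monitoring graph before and during your benchmark run.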
But more often than not, your first problem will be voltage clipping, not heat. You can search online for more details on this common issue: the higher the frequency, the lower the signal voltage amplitude gets, and once it drops too low, that's when all the common overclocking problems show up, like corrupted textures and strange dots all over your screen.
What users normally do is increase the voltage to compensate for this loss, but that's more commonly done for the CPU and RAM. Increasing the voltage on your graphics card is never easy; sometimes it requires modifying your hardware, like soldering a different resistor onto a certain part of the board. On some older cards it can be done just by flashing a modified graphics BIOS, but that isn't an advisable thing to do, let alone modifying your hardware.
It's this increase in voltage that really causes the big jump in temperature. So long as you leave the voltage alone, you should be just fine. Overclock until you see graphics corruption when playing games, then tone it down a little.
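The "push until corruption, then tone it down" approach boils down to a simple loop. In this sketch, `artifacts_at(mhz)` is a hypothetical placeholder for your own manual test (running a game at that clock and watching for dots or corruption), not a real API, and the step sizes are made-up numbers.

```python
def find_stable_clock(base_mhz, step_mhz, artifacts_at, max_mhz=1000):
    """Raise the core clock one step at a time until artifacts appear
    (or max_mhz is reached), then back off one extra step for margin.

    artifacts_at(mhz) is an assumed callback: True if you saw
    corruption at that clock."""
    clock = base_mhz
    while clock + step_mhz <= max_mhz and not artifacts_at(clock + step_mhz):
        clock += step_mhz
    # clock is the last artifact-free setting; drop one more step
    # to "tone it down a little" as suggested above
    return clock - step_mhz

# Pretend artifacts show up above 700 MHz:
print(find_stable_clock(600, 25, lambda mhz: mhz > 700))  # -> 675
```

The extra step of margin matters because a clock that is artifact-free in one game can still fail in another workload.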
2007-06-19 21:57:55 · answer #1 · answered by Hornet One 7