Most people just accept the concept of significant digits and never actually think about why it is used. In particular, the two responses I have seen to your question don't get at the real meaning.
The reason for significant digits is this:
they measure the RELATIVE precision of a number. In other words, they indicate the potential PERCENTAGE error in the value.
Examples:
If we are given the number 8.6, and we know that it is NOT 8.5 or 8.7, we know the value within an accuracy of 0.1 / 8.6, which is a little over 1%. And if we were given the number 0.0086 and knew it was not 0.0085 or 0.0087, we would again have 2 significant digits and would know the value within about 1%. With 3 significant digits, the potential error would be about 0.1%, so the accuracy would be about 10 times as high.
But what if the number we're given is 1.2? Again we have 2 significant digits, but the potential error is about 1 in 12 (0.1 out of 1.2), which is more than 8%. Why the big difference from the previous calculation with two significant digits? It's simply because the number starts with 1 instead of a large digit (8 in the previous example), so being off by 1 in the last significant digit is a larger percentage of the number.
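The arithmetic in these examples can be sketched in a few lines of Python. This is just an illustration of the reasoning above; the helper name `relative_error`, and the assumption that the uncertainty is one unit in the last significant place, are mine, not part of the answer:

```python
import math

def relative_error(value, sig_digits):
    """Rough percentage uncertainty implied by the last significant digit.

    Assumes the value is uncertain by one unit in its last significant
    place, as in the 8.6-versus-8.5-or-8.7 example above.
    """
    # Position of the leading digit (0 for 8.6, -3 for 0.0086).
    exponent = math.floor(math.log10(abs(value)))
    # One unit in the last significant place.
    last_place = 10 ** (exponent - sig_digits + 1)
    return 100 * last_place / abs(value)

print(round(relative_error(8.6, 2), 2))     # ~1.16 percent
print(round(relative_error(0.0086, 2), 2))  # ~1.16 percent, same as 8.6
print(round(relative_error(1.2, 2), 2))     # ~8.33 percent, much worse
```

Note that 8.6 and 0.0086 come out identical: moving the decimal point changes nothing, because only the count of significant digits matters for relative precision.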
Conclusion: Significant digits are a ROUGH concept. A given number of digits gives a GENERAL idea of a number's accuracy, but it is not a very precise measure of accuracy. It would be more meaningful (but more time-consuming) to give the number as, for example, 1.2 "plus or minus 0.06" (or whatever the potential uncertainty happens to be), or "plus or minus 2.3%."
Final note: A warning. The concept of significant digits relates to numbers that are being multiplied together (or divided). If you have 3 numbers to multiply together, and each is accurate to 4 significant digits, you can keep 4 digits in the product and expect them to be reasonably accurate (i.e., you basically get 4 significant digits in the answer).
What you CAN'T do is apply the same rule when you ADD (or subtract) the numbers, because then the question isn't "to how many significant digits are the numbers accurate?" but "to how many DECIMAL PLACES are they accurate?"
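The multiply-versus-add distinction can be shown concretely. The numbers below are my own illustration, not from the answer:

```python
# For addition, precision is governed by DECIMAL PLACES, not sig figs:
# 123.4 (1 decimal place) + 0.5678 (4 decimal places) is only good
# to 1 decimal place, even though both inputs have 4 sig figs.
total = 123.4 + 0.5678
print(round(total, 1))    # 124.0 -- keep 1 decimal place

# For multiplication, precision is governed by SIGNIFICANT DIGITS:
# both factors have 4 sig figs, so the product keeps about 4.
product = 123.4 * 0.5678
print(f"{product:.4g}")   # 70.07 -- keep 4 significant digits
```

In the sum, the four decimal places of 0.5678 are wasted: the uncertainty of 123.4 in its tenths place swamps them, which is exactly the "decimal places, not significant digits" point above.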
2006-10-04 17:00:07 · answer #1 · answered by actuator 5
In the real world, measurements are usually accurate to three significant digits, regardless of the placement of the decimal point; any more than three digits and the scale is probably wrong. The rule of thumb in experiments is that the answer cannot be any more accurate than the least accurate number. A good engineer will select scales and measurement devices that return equally accurate numbers. For example, the average of the following numbers can only have one significant digit: 1.222, 2.2, 3, 4.3, 5.4, & 7.8. The least accurate number is "3". If this answer is acceptable, then there is no reason to pay for a machine that will measure beyond one digit. In car races, speed is usually given to six significant digits (e.g. 185.124). This number is meaningless because the distance travelled is not known to that precision. The only important number to know is the amount of time it takes to complete a lap, and time can be measured quite accurately. Ask any race car team manager and he will discuss time, not speed.
2006-10-05 00:33:18 · answer #2 · answered by Richard B 4
Let's say, for example, that an experiment gives you two measured values, 7.63 and 1.45, and the result you need is the product of their square roots.
sqrt 7.63 = 2.7622454633866266882253758278599
sqrt 1.45 = 1.2041594578792295480128241030379
sqrt 7.63 * sqrt 1.45 = 3.326183999723856724391513097231
But how accurate is that result? In your experiment you only measured the values to three digits (7.63 and 1.45), yet the product above is written with about 30 digits after the decimal point.
This is where the concept of significant digits comes in. Your measured values each have three significant digits, so the result can only be trusted to three significant digits. Rounding the intermediate values:
sqrt 1.45 = 1.20
sqrt 7.63 = 2.76
sqrt 1.45 * sqrt 7.63 = 3.312
But the fourth digit (the third place after the decimal) has no meaning, because your experimental results are only good to three significant digits. So the answer is reported as 3.31.
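This calculation can be reproduced in Python; the `round_sig` helper is my own illustration. One caveat worth noting: rounding only at the end gives 3.33 rather than the 3.31 obtained from the pre-rounded intermediates above, which is why the usual practice is to keep full precision until the final step and round once:

```python
import math

def round_sig(x, sig):
    """Round x to `sig` significant digits (illustrative helper)."""
    return round(x, sig - 1 - math.floor(math.log10(abs(x))))

a, b = 7.63, 1.45                 # each measured to 3 significant digits
product = math.sqrt(a) * math.sqrt(b)
print(product)                    # 3.3261..., full float precision
# The inputs carry only 3 significant digits, so report 3:
print(round_sig(product, 3))      # 3.33
```

Either way, everything past the third significant digit is noise: the inputs simply don't contain that much information.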
I hope this makes the concept of significant digits clearer.
2006-10-05 11:50:27 · answer #3 · answered by Anonymous
Because sometimes it's necessary to have an answer as exact as possible. Significant digits allow us to determine just how accurate an answer can be without introducing error.
2006-10-04 23:02:44 · answer #4 · answered by bequalming 5
Well, if you and I each measure something, say the speed of light, and your answer is 301000 km/s and mine is 300000 km/s, and our margins of error are small enough that the two ranges can't overlap (say, less than 500 km/s each), then one of us must be wrong. But if our margins of error are larger, then maybe we're both right. Significant digits are a way of showing your margin of error, so mistakes can be picked up.
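The consistency check described here amounts to asking whether two error intervals overlap. This is my own sketch of that idea, with made-up margins for illustration:

```python
def consistent(value_a, err_a, value_b, err_b):
    """Two measurements agree if their error intervals overlap."""
    return abs(value_a - value_b) <= err_a + err_b

# Margins of 400 km/s each: the intervals can't overlap,
# so at least one measurement must be wrong.
print(consistent(301000, 400, 300000, 400))   # False
# Margins of 600 km/s each: the intervals overlap,
# so both measurements could be right.
print(consistent(301000, 600, 300000, 600))   # True
```

Reporting the right number of significant digits is a shorthand for communicating those margins, so that exactly this kind of comparison is possible.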
2006-10-04 23:03:31 · answer #5 · answered by zee_prime 6