Your answer is only as accurate as the information you were given to compute it. If your source data only has two significant figures, giving your answer to seven significant figures won't make it any more accurate than giving it to two significant figures.
2007-09-04 14:41:21 · answer #1 · answered by Scarlet Manuka 7 · 3⤊ 0⤋
You are addressing significant digits. It is pointless for me to tell the foreman that I delivered 6914 ounces of milk when all the rest of the people are delivering - and measuring - in quarts... and I may be paid by the quart. My 6914 oz. (216.0625 quarts, at 32 oz. per quart) is going to be measured as 216 quarts... the .0625 qts. just sort of "goes away".
It is really a formalized method of "rounding off". The idea is: if you are building your answer out of several numbers, and one of them is only accurate to one decimal place, then when you multiply or divide by that number, does your product or quotient have any more veracity or "provability" than the weakest link in the chain? No. The result is only as "accurate" as the least precise number that went into it.
It is different with addition and subtraction - there the result keeps the decimal places of the least precise term, because the individual numbers don't influence their brethren the way factors do.
The fact that our calculators will carry out an answer to six decimal places does not really mean there is any point!
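The multiplication rule described above can be sketched in a few lines of Python. The `round_sig` helper is hypothetical (significant-figure rounding is not built into the language), but it shows the idea:

```python
import math

def round_sig(x, sig):
    """Round x to `sig` significant figures (hypothetical helper)."""
    if x == 0:
        return 0.0
    return round(x, sig - 1 - math.floor(math.log10(abs(x))))

# 4.32 has three significant figures, 1.6 only two,
# so the product is only trustworthy to two.
product = 4.32 * 1.6          # calculator says 6.912
print(round_sig(product, 2))  # 6.9
```

The calculator happily shows 6.912, but only the leading two digits carry any real information.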
2007-09-04 15:05:17 · answer #2 · answered by Richard S 6 · 0⤊ 0⤋
If you mean the number of digits that follow a decimal point, it will be more accurate.
2007-09-07 14:40:21 · answer #3 · answered by johnandeileen2000 7 · 0⤊ 1⤋
An answer with more digits is in all likelihood more precise, but it is not necessarily more accurate. If the answer is wrong to begin with, extra digits after the decimal point only make it more precisely wrong.
2007-09-04 14:43:03 · answer #4 · answered by Candidus 6 · 1⤊ 0⤋
Ah, I remember this vaguely - something called significant figures. If your measurement is off by ±0.05%, then in an answer like 3.238742973 everything past the 3rd or 4th digit is meaningless, because it could really be anything within the margin of error.
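That margin-of-error point can be made concrete. The numbers below are taken from the answer above, with ±0.05% treated as a relative uncertainty (an illustrative sketch, not a rigorous error analysis):

```python
value = 3.238742973
rel_err = 0.0005                   # ±0.05% relative uncertainty
spread = value * rel_err           # about ±0.0016
low, high = value - spread, value + spread
print(f"{low:.6f} .. {high:.6f}")  # the 3rd decimal place already wobbles
```

Since the true value could land anywhere in that interval, every digit past roughly the third decimal place is noise.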
2007-09-04 14:42:13 · answer #5 · answered by disruption_grey 4 · 2⤊ 0⤋
It's more accurate, for sure, but the problem with decimals is that most of the time they're not exact. You can write .3333333333 until your hand falls off, but it will never make 1/3. A fraction is more accurate and easier to work with.
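Python's standard `fractions` module illustrates the point about exact fractions versus truncated decimals (a sketch, not part of the original answer):

```python
from fractions import Fraction

third = Fraction(1, 3)
print(third + third + third)  # 1 -- exact rational arithmetic
print(0.3333333333 * 3 == 1)  # False: the truncated decimal never makes 1
print(third == 1 / 3)         # False: the float 1/3 is only an approximation
```

Rational arithmetic stays exact no matter how many operations you chain, while the decimal form loses something at every truncation.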
2007-09-04 14:41:34 · answer #6 · answered by Anonymous · 0⤊ 1⤋
Because it looks more accurate than it really is.
2007-09-04 14:42:11 · answer #7 · answered by *mRs.GaBrIeL* 5 · 0⤊ 3⤋
huh...
IDK
2007-09-04 14:41:25 · answer #8 · answered by buckshotbullies 3 · 0⤊ 3⤋