
I don't understand how many significant digits you should have when you're measuring things. Say you're measuring on the cm side of a ruler and it reads 3.5 cm. Would you write 3.5 cm, 3.50 cm, or 3.500 cm? Can someone please explain how many significant digits there should be, so I'll know how to record any measurement?

2007-09-16 13:38:51 · 2 answers · asked by ... 2 in Science & Mathematics Chemistry

2 answers

When using anything with an analog scale (such as a ruler), you should estimate one decimal place past the finest marked interval.

Let's say, for example, that your ruler is marked in tenths of a centimeter. Find the last tenth-of-a-centimeter mark that the thing you're measuring reaches without passing. Then mentally divide the space up to the next mark into 10 smaller intervals, and estimate one more decimal place for where the edge of the thing you're measuring falls.

It could very well be 3.50 cm, but it could also be 3.53 cm, 3.54 cm, 3.57 cm, etc.
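A minimal sketch of this rule in Python, assuming a function and signature I've made up for illustration: it formats a reading with one estimated decimal place beyond the finest marked interval, so the trailing zero in 3.50 cm is kept as a significant digit.

```python
import math

def record_reading(value_cm: float, finest_interval_cm: float) -> str:
    """Format a reading with one estimated digit past the finest marking."""
    # Decimal places implied by the finest marking, e.g. 0.1 cm -> 1 place.
    marked_places = max(0, -math.floor(math.log10(finest_interval_cm)))
    # Keep one additional, estimated place; the trailing zero is significant.
    return f"{value_cm:.{marked_places + 1}f} cm"

print(record_reading(3.5, 0.1))   # -> 3.50 cm
print(record_reading(3.53, 0.1))  # -> 3.53 cm
```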

2007-09-16 13:53:51 · answer #1 · answered by Ben H 6 · 0 0

The number of significant digits you can use depends on the precision of the measuring instrument. The precision is the smallest amount that can be distinguished in the measurement. On a ruler, a 1 cm interval may be divided into 10 parts; you could therefore measure to a precision of 0.1 cm (0.001 m). If your overall measurement was 1 m, you could have 4 significant digits (x.xxx m); this is because anything smaller than 0.001 m has no meaning, as it cannot be read on the ruler. However, if the overall measurement is only 1 cm, you can only have 2 significant digits (x.x cm). Therefore the number of significant digits depends on both the size of the measured quantity and the smallest amount your instrument can distinguish.
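A small sketch of the counting rule this answer describes (the function name is hypothetical, not from the answer): count the digits from the leading digit of the measurement down to the smallest increment the instrument can distinguish.

```python
import math

def significant_digits(measurement: float, precision: float) -> int:
    """Digits from the measurement's leading digit down to the instrument's precision."""
    return math.floor(math.log10(measurement)) - math.floor(math.log10(precision)) + 1

print(significant_digits(100.0, 0.1))  # 1 m read to 0.1 cm -> 4 significant digits
print(significant_digits(1.0, 0.1))    # 1 cm read to 0.1 cm -> 2 significant digits
```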

2007-09-16 13:50:22 · answer #2 · answered by gp4rts 7 · 0 0
