I have just bought a digital pressure gauge (Digitron 2083P) for some medical research. In a journal article I will be writing, I need to state the accuracy and precision of the instrument. The leaflet tells me this: 0.3% rdg + 0.3% fs + 1 digit.
Unfortunately I do not understand what this means. I need to explain it to a medical audience, but I have to understand it first!

2006-12-17 07:35:16 · 2 answers · asked by Nirmala 4 in Science & Mathematics Engineering

The operating range is 2 bar.

2006-12-17 21:38:16 · update #1

2 answers

0.3% rdg means 0.3% of the reading itself, so this part of the error scales with the displayed pressure: a reading of 1.000 bar carries ±0.003 bar from this term alone, while a reading of 2.000 bar carries ±0.006 bar.

The 0.3% fs says the error also includes 0.3% of the full-scale range, regardless of where on the scale you are reading. If the gauge's range is 0-100 pascals, this term contributes ±0.3 pascals at any reading.

+ 1 digit means the last digit of the display can be off by one count in either direction. If the display resolves to 0.01, that adds a further ±0.01 to the error budget.
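
To make the arithmetic concrete, here is a minimal Python sketch that just adds the three terms for the asker's 2 bar gauge. The 0.001 bar display resolution is an assumption for illustration, not the 2083P's published figure.

```python
# Worst-case spec error for a digital gauge: %rdg + %fs + 1 digit.
# full_scale follows the asker's 2 bar range; the 0.001 bar resolution
# is a guessed display step, not the Digitron 2083P's actual spec.

def spec_error(reading, full_scale=2.0, resolution=0.001):
    """Return the worst-case error in bar for a given reading in bar."""
    return 0.003 * reading + 0.003 * full_scale + resolution

reading = 1.500  # bar
print(f"{reading:.3f} +/- {spec_error(reading):.4f} bar")
# -> 1.500 +/- 0.0115 bar
```

Notice that only the first term shrinks with the reading; the full-scale and digit terms are fixed, so the relative error is worst near the bottom of the range.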

2006-12-17 07:59:09 · answer #1 · answered by MrWiz 4 · 0 0

I can't answer the question fully without knowing the confidence limits of the accuracy rating. It could be a 68% confidence limit, or a 95% or 99.7% one; these correspond to +/- 1, 2, or 3 standard deviations from the mean.

What the rating is getting at is:
0.3% of the reading
+ 0.3% of full scale
+ 1 count of the lowest digit on the meter

I am not familiar with the instrument, but say you had a 1000-count display, which actually reads 0-999.
If your pressure reading were 7.00 bar on a range that reads 0-10 bar (actually 9.99), the lowest count on that range would be 0.01 bar.

Your "accuracy" would be 7.00 +/- (0.003*7.00 + 0.003*10 + 0.01),
so 7.00 +/- 0.06. Assuming the rating is a 68% (one standard deviation) confidence limit, the three-sigma figure is 7.0 +/- 3*0.061, or 7.0 +/- 0.2, meaning that about 99.7% of the time the true value lies within those limits.

Your precision is just the finest increment the meter can resolve on the range the measurement is taken on, which here would be 0.01 bar.
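
The same arithmetic in a short Python sketch, reproducing the numbers above; the 3x multiplier rests on the (unconfirmed) assumption that the manufacturer's figure is a one-sigma, 68% limit.

```python
# Answer #2's worked example: 7.00 bar reading on a 0-9.99 bar range
# with a 0.01 bar least count. The 1-sigma interpretation of the spec
# is an assumption; the leaflet does not state a confidence level.

reading, full_scale, resolution = 7.00, 10.0, 0.01
one_sigma = 0.003 * reading + 0.003 * full_scale + resolution  # 0.061 bar
three_sigma = 3 * one_sigma                                    # 0.183 bar
print(f"{reading:.2f} +/- {one_sigma:.2f} bar (as specified)")
print(f"{reading:.2f} +/- {three_sigma:.1f} bar (approx. 99.7% coverage)")
```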

2006-12-17 17:14:07 · answer #2 · answered by Ron E 5 · 0 1
