
I have a measurement device that I want to calibrate. From observation, it appears that a quadratic calibration equation will be needed, i.e., y = ax^2 + bx + c, where y is the value to be displayed, x is the raw value produced by the device, and a, b, & c are the calibration coefficients. The plan is to simultaneously take 'n' different measurements from the device (i.e., a list of 'x' values) and from a calibration standard (i.e., a list of 'y' values) that are approximately uniformly distributed over the device measurement range, and then to use least-squares regression to compute a, b, & c. The minimum value of 'n' is clearly 3. Intuitively, increasing 'n' will reduce the error of the calibration process; however, I'm hoping to get some quantitative guidance in choosing 'n'.
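For reference, here is a minimal sketch of the least-squares step itself, in Python with NumPy (the x and y numbers below are invented purely for illustration):

import numpy as np

# Hypothetical paired readings: x from the device, y from the
# calibration standard, spread across the measurement range.
x = np.array([0.5, 1.8, 3.1, 4.4, 5.7, 7.0, 8.3, 9.6])
y = np.array([1.2, 2.9, 5.4, 8.7, 12.8, 17.7, 23.4, 29.9])

# Least-squares fit of y = a*x^2 + b*x + c; np.polyfit returns the
# coefficients highest power first, so the result is [a, b, c].
a, b, c = np.polyfit(x, y, deg=2)
print(f"a = {a:.4f}, b = {b:.4f}, c = {c:.4f}")

# Applying the calibration to a new raw reading:
x_raw = 6.2
print(f"calibrated value: {a * x_raw**2 + b * x_raw + c:.4f}")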

2007-01-09 12:19:52 · 2 answers · asked by Calli Braytor 1 in Science & Mathematics Engineering

2 answers

I would go with either 3 or 30.

3 is the minimum number of points needed to determine the three coefficients of a quadratic.

4-29 points are more work, and each additional point improves the fit by progressively less.

30 is the conventional statistical rule of thumb for a sample large enough to represent an entire population.
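A quick way to put numbers on that trade-off is a simulation. The sketch below (Python with NumPy; the "true" curve and the noise level are made-up values, chosen only for illustration) fits a quadratic to n noisy calibration points many times and reports the average RMS error of the fitted curve:

import numpy as np

rng = np.random.default_rng(0)

# Assumed "true" calibration curve and measurement noise; both are
# invented here purely to illustrate how the fit error falls with n.
a_true, b_true, c_true = 0.30, 1.50, 0.80
noise_sd = 0.05

def rms_fit_error(n, trials=2000):
    # RMS deviation of the fitted curve from the true curve over the
    # 0..10 range, averaged over many simulated calibration runs.
    x = np.linspace(0.0, 10.0, n)         # n uniformly spaced cal points
    x_test = np.linspace(0.0, 10.0, 101)  # evaluation grid
    y_true = a_true * x_test**2 + b_true * x_test + c_true
    errs = []
    for _ in range(trials):
        y = a_true * x**2 + b_true * x + c_true + rng.normal(0.0, noise_sd, n)
        a, b, c = np.polyfit(x, y, deg=2)
        y_fit = a * x_test**2 + b * x_test + c
        errs.append(np.sqrt(np.mean((y_fit - y_true) ** 2)))
    return float(np.mean(errs))

for n in (3, 5, 10, 30, 100):
    print(f"n = {n:3d}: mean RMS calibration error = {rms_fit_error(n):.4f}")

Under these assumptions the error falls off roughly as 1/sqrt(n), which is the kind of quantitative guidance the question asks for: quadrupling the number of calibration points roughly halves the calibration error.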

2007-01-09 12:44:16 · answer #1 · answered by MrWiz 4 · 0 0

Depending on your margin of error you can use as few as three (two points are not enough to determine three coefficients), but normal usage calls for more than that minimum. As 'n' increases, the standard deviation of the sample can fluctuate, which can skew the data.

2007-01-09 20:34:53 · answer #2 · answered by Brian F 4 · 0 0
