
2 answers

12 bits gives a theoretical resolution of 1 part in 4096. In practice it is more like 1 part in 2048 once noise is factored in. Even for the best instruments, 1 part in 2900 is about all that is practical.

1 part in 2900 is 345 parts-per-million. Apparently most laboratory measurements only need that kind of resolution.
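For anyone who wants to check that arithmetic, here is a minimal Python sketch. The 2900-level figure is just the practical limit quoted above, not a measured value:

```python
# Quick check of the resolution figures in this answer.

bits = 12
ideal_levels = 2 ** bits             # 4096 discrete codes for a 12-bit ADC
practical_levels = 2900              # best-case usable resolution cited above

ppm = 1_000_000 / practical_levels   # resolution expressed in parts-per-million
print(f"Ideal: 1 part in {ideal_levels}")
print(f"Practical: 1 part in {practical_levels} = {ppm:.0f} ppm")
```

Running it prints "1 part in 2900 = 345 ppm", matching the figure above.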

I worked on seismic systems that required 24-bit digitizers; they could detect people walking 50 feet away. Obviously not everyone needs *that* kind of accuracy.


2007-08-17 05:23:39 · answer #1 · answered by tlbs101 7 · 0 1

An analog signal varies continuously and can therefore take on an infinite number of amplitude values. A digitizer represents this continuum with a finite set of discrete levels, and the difference between the digital level and the actual analog level is the quantization error. More bits in the quantizer means finer granularity in the digital levels and a smaller quantization error. See the reference in "sources" for the math. The bottom line is that each bit in your quantizer buys you 6.02 dB more signal-to-noise ratio. For 12 bits, this works out to 72.24 dB, which is a power ratio of about 16.7 million to one, or an amplitude resolution of one part in 4096. Good enough for most laboratory measurements!
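As a sanity check on the 6.02 dB-per-bit rule, here is a small Python sketch. It uses the simplified rule from this answer; the exact formula for a full-scale sine wave adds a 1.76 dB offset on top:

```python
# Sanity check of the dB figures above, using the simplified
# 6.02 dB-per-bit rule (the full-scale-sine formula adds 1.76 dB).

def quantization_snr_db(bits: int) -> float:
    """Approximate SNR in dB from quantization noise alone."""
    return 6.02 * bits

for n in (12, 16, 24):
    snr = quantization_snr_db(n)
    amplitude_ratio = 10 ** (snr / 20)  # ~2**n, i.e. 1 part in 4096 for 12 bits
    power_ratio = 10 ** (snr / 10)      # ~16.7 million for 12 bits
    print(f"{n:2d} bits: {snr:6.2f} dB, "
          f"amplitude 1:{amplitude_ratio:,.0f}, power 1:{power_ratio:,.0f}")
```

For 12 bits this prints 72.24 dB, an amplitude ratio of about 1:4096, and a power ratio of about 1:16.7 million, which is where the two big numbers in this answer come from.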

2007-08-17 12:37:42 · answer #2 · answered by Űbergeek 5 · 1 0
