
It has to do with logarithmic tables.

Like with star brightness, they'll say one star is five orders of magnitude brighter than another, which means it's a hundred times brighter.

10 squared is 100; I don't get this.

2007-09-30 11:31:03 · 1 answer · asked by SQD 2 in Science & Mathematics Physics

1 answer

Since we generally think in decimal, one order of magnitude is usually taken to be a factor of 10. Greater sensitivity means a smaller minimum discernible signal, so an order of magnitude greater sensitivity means being able to detect a signal 1/10 as strong. Basically, the "units" of these orders of magnitude are base-10 logarithms.
Star magnitudes, the Richter earthquake scale, and time constants of exponential growth and decay all involve some base number, and the units are logarithms to that base. The base of star magnitudes is 100^0.2, or about 2.512; as you say, 5 magnitudes is a ratio of 100 (see the ref. for the reason why). The Richter scale is base 10. Time constants are base e, that is, natural logarithms.
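The magnitude arithmetic above can be sketched in a few lines of Python (the function name is my own, for illustration): each step of one magnitude multiplies brightness by 100^0.2, so five magnitudes multiply it by 100.

```python
def brightness_ratio(delta_mag):
    """Brightness ratio corresponding to a difference in star magnitudes.

    The base is 100 ** 0.2 (about 2.512), so a difference of 5
    magnitudes is a brightness ratio of exactly 100.
    """
    return (100 ** 0.2) ** delta_mag

print(brightness_ratio(1))  # about 2.512
print(brightness_ratio(5))  # about 100
```

The same pattern works for any logarithmic scale: swap the base for 10 to get Richter-style ratios, or for e to get time-constant factors.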

2007-10-02 16:05:04 · answer #1 · answered by kirchwey 7 · 1 0
