
When we calculate the standard deviation, we square the differences first to get the variance and then take the square root of it. But why don't we cube the differences instead (or raise them to the power of 1.2, or whatever) and take the cube root (or whatever root)?

2007-12-01 02:35:49 · 2 answers · asked by nil 2 in Science & Mathematics Mathematics

I hope I'm using proper English...

2007-12-01 02:36:31 · update #1

2 answers

One of the effects of squaring a difference is that the result is always positive, even if the difference is negative. This is common in many 'distance' functions, where you want the function to never be negative.

The standard deviation is a measure of how far a variable typically falls from its mean. If the variable is off by +2 as often as it is by -2, then simply adding the errors (or their cubes, for example) would cancel them out.

There would be no way to distinguish between an inaccurate variable that is constantly off by +/- 2 and one that is off by +/- 0.1.
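A quick Python sketch of that cancellation, using made-up deviations of +/- 2 (not taken from any real data):

    # Made-up deviations from the mean: +2 as often as -2.
    deviations = [2, -2, 2, -2]
    n = len(deviations)

    # Plain sums of the deviations (or of odd powers like cubes) cancel out...
    print(sum(d for d in deviations) / n)       # 0.0
    print(sum(d**3 for d in deviations) / n)    # 0.0

    # ...while squared deviations do not, so the spread is still visible.
    variance = sum(d**2 for d in deviations) / n
    print(variance)                             # 4.0  (variance)
    print(variance ** 0.5)                      # 2.0  (standard deviation)

With the cube (or any odd power), these equally balanced errors would look exactly like no error at all.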

2007-12-01 02:41:43 · answer #1 · answered by Raymond 7 · 0 0

The variance is a measure of the spread of the data, but its unit is the square of the unit of measurement; the positive square root of the variance is the standard deviation, which has the same unit as the measurements. If we shift the origin to the mean and scale by the standard deviation, the transformed data have mean zero and variance one. Many applications need the standard deviation in exactly this role, which is why it is defined as the square root of the variance.

The average of the cubed differences from the mean is the third central moment, which is used for checking the skewness of the data. Its unit is the cube of the unit of measurement. Since we have no use for the cube root of the third central moment (whereas we do need the square root of the variance for standardizing data), we do not take a cube root.
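A small Python sketch of those two points (standardizing to mean 0 and variance 1, and the third central moment carrying cubed units), using a made-up sample:

    import statistics

    # Made-up sample, just to illustrate the units argument.
    data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

    mean = statistics.fmean(data)
    var = statistics.pvariance(data)   # units: (measurement unit) squared
    sd = statistics.pstdev(data)       # units: same as the measurements

    # Shift the origin to the mean and scale by the standard deviation:
    # the transformed data have mean 0 and variance 1.
    z = [(x - mean) / sd for x in data]
    print(statistics.fmean(z), statistics.pvariance(z))   # 0.0 1.0

    # Third central moment: average cubed deviation, in cubed units.
    # Its sign indicates skewness; there is no need to take its cube root.
    m3 = sum((x - mean) ** 3 for x in data) / len(data)
    print(m3)   # positive here, so this made-up sample is skewed to the right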

2007-12-01 11:06:09 · answer #2 · answered by meshu 1 · 0 0
