The mean is the average value, and the average can sometimes be misleading. For example, in a class of 5 students aged 13, 14, 14, 15 and 34, the mean works out to 18. No student is that age, or anywhere near it. This happened because one extreme value has distorted the whole picture. The variance and standard deviation reveal this at once: the standard deviation tells you how much the data is spread out (and related measures, such as skewness and kurtosis, tell you whether it is skewed or peaky), and so on. Together they give much greater insight into the data.
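Here's a quick sketch of that example in Python (just the standard statistics module; the list name and values come straight from the example above):

```python
import statistics

ages = [13, 14, 14, 15, 34]

mean = statistics.mean(ages)           # (13+14+14+15+34)/5 = 18
variance = statistics.pvariance(ages)  # population variance
sd = statistics.pstdev(ages)           # population standard deviation

print(mean)      # 18   -- no student is actually anywhere near this age
print(variance)  # 64.4
print(sd)        # ~8.02 -- the large spread flags the distorting value (34)
```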
2006-10-02 07:37:32 · answer #1 · answered by openpsychy
Mean represents a sort of "weighted center" of a distribution of data.
Physically, the "mean" of a probability distribution is equivalent to its "center of mass." In other words, if you cut out a probability distribution (like a bell curve) then you could balance it on your finger right at the value of its mean.
For example, picture a probability distribution where there is a 20% chance of getting a 5 and an 80% chance of getting a 10. In that case, the mean is 0.2*5 + 0.8*10 = 1 + 8 = 9. As you can see, there is 80% of the "mass" exactly 1 away from 9 and 20% of the mass exactly 4 away from 9, and 80%*1 = 20%*4. If the probability distribution were a lever (like a see-saw) with the fulcrum at 9, the lever would be perfectly balanced.
So that's the "physical" interpretation of mean. It's a "center of mass."
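If it helps, here's a minimal Python sketch of that "center of mass" picture, using the 20%/80% numbers from the example above (nothing standard, just those values):

```python
values = [5, 10]
probs = [0.2, 0.8]

# Weighted mean = "center of mass" of the distribution.
mean = sum(p * x for p, x in zip(probs, values))
print(mean)  # 0.2*5 + 0.8*10 = 9.0

# Balance check: the total "moment" about the mean is zero, i.e. the
# mass to the left of 9 exactly offsets the mass to the right of it.
moment = sum(p * (x - mean) for p, x in zip(probs, values))
print(moment)  # 0.2*(-4) + 0.8*(1) = 0.0
```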
Variance is a measure of the amount of "noise power." It is a measure of how much a random variable varies. For example, if there were a 100% chance of getting a 5, the variable would not be random; it would be deterministic, and it would thus have 0 variance. However, if there were only an 80% chance of getting a 5, a 10% chance of getting a 4 and a 10% chance of getting a 6, then the random variable would have a variance of 0.2, which reflects the 10% chances sitting one unit to the left and right of 5. In other words, 2*10%*1² = 0.2. If you change those 10% chances to 5%, you get a variance of 0.1, because 2*5%*1² = 0.1. The variance is measuring the "spread" of the data. It's measuring the amount of noise you're going to get around your mean (the mean in this case is 5).
The standard deviation is just the square root of the variance. The variance is in squared units because it is actually the "expected" squared distance from the mean. In other words, if the variance is 1, we expect that if we square the distance from the mean, we'll get a value around 1. The standard deviation just converts this expectation back into our original units. That way, if we have a variance of 4, we'll have a standard deviation of 2. It's just more convenient to express the spread in normal units rather than square units.
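A small Python sketch of the variance and standard deviation for the 80%/10%/10% distribution above (the numbers are just the ones from the example):

```python
values = [4, 5, 6]
probs = [0.1, 0.8, 0.1]

mean = sum(p * x for p, x in zip(probs, values))                    # 5.0
variance = sum(p * (x - mean) ** 2 for p, x in zip(probs, values))  # 0.2
std_dev = variance ** 0.5                                           # ~0.447

print(mean, variance, std_dev)

# Shrink the side probabilities to 5% (putting 90% on 5) and the
# variance halves to 0.1, exactly as described above.
```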
I hope that's "physical" enough.
2006-10-02 08:57:04 · answer #2 · answered by Ted
*whips out statistics book*
-Mean is the sum of the values, divided by the total number of values.
-Variance is the average of the squared distances of each value from the mean.
-Standard deviation is the square root of the variance.
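Those three definitions translate almost word for word into code. A plain-Python sketch (this is the population form, dividing by N; a sample variance would divide by N-1):

```python
def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def std_dev(xs):
    return variance(xs) ** 0.5

data = [13, 14, 14, 15, 34]  # the class-ages example from answer #1
print(mean(data), variance(data), std_dev(data))  # 18.0 64.4 ~8.02
```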
2006-10-02 07:37:59 · answer #4 · answered by Anonymous