When dealing with statistics, there are many types of deviation: population, sample, etc. Deviation is a measure of the typical amount an entry deviates from the mean. The more spread out the entries are, the greater the standard deviation.
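(Not part of the original answer, but a minimal Python sketch of the population-vs-sample distinction mentioned above; the data values are made up for illustration.)

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # hypothetical measurements

# Population standard deviation: divide by N (treat the data as the whole population).
population_sd = statistics.pstdev(data)

# Sample standard deviation: divide by N - 1 (treat the data as a sample of a larger population).
sample_sd = statistics.stdev(data)

print(population_sd)  # 2.0
print(sample_sd)      # ~2.138
```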
2007-02-27 03:49:10 · answer #1 · answered by Rudeline C 1 · 0⤊ 0⤋
Standard deviation tells you how spread out a group of numbers is. Let's say you're in charge of quality control, and you have a numerical way of evaluating how good your product is in terms of quality.
Imagine that you measure quality on a scale of 1-100, and your ideal is 70 (it's a pretend exercise, so we can have a usually bad product). Your average quality is, say, 75.
That's great, right?
Not really - you only know the average, but not how far away from the average your quality numbers really are.
Now add in standard deviation. If the standard deviation is, say, 5, that's not bad at all: roughly 2/3 of the results in a normally distributed sample fall within one standard deviation of the mean (a standard property of the normal curve), so about 2/3 of your products measure between 70 and 80 on your quality scale.
But what if your standard deviation is 15? That means a sizable portion of your products fall below your minimum score of 70. And that means you build crappy products.
So standard deviation tells us how tightly the measurements cluster around the mean, and how spread out a sample is.
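A quick way to check those proportions, assuming the quality scores are roughly normally distributed (the mean of 75, standard deviations of 5 and 15, and cutoff of 70 come from the scenario above; everything else is a hypothetical sketch):

```python
from statistics import NormalDist

good_line = NormalDist(mu=75, sigma=5)     # tight process
sloppy_line = NormalDist(mu=75, sigma=15)  # spread-out process

# Share of products scoring between 70 and 80 (within one standard deviation of the mean).
print(good_line.cdf(80) - good_line.cdf(70))  # ~0.683

# Share of products falling below the minimum score of 70.
print(good_line.cdf(70))    # ~0.159
print(sloppy_line.cdf(70))  # ~0.369
```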
2007-02-27 11:44:05 · answer #2 · answered by Brian L 7 · 1⤊ 0⤋
Also, when the standard deviation is combined with the mean, it gives a description of a normal curve. It can be used to compute the standard z-score, which in turn is used to find percentiles and compare scores from different tests.
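A sketch (with made-up scores and scales) of how the z-score turns a mean and standard deviation into percentiles, so results from different tests can be compared:

```python
from statistics import NormalDist

def z_score(x, mean, sd):
    """Standard z-score: how many standard deviations x lies above the mean."""
    return (x - mean) / sd

# Hypothetical test scores on two different scales.
z_math    = z_score(82, mean=70, sd=8)   # 1.5
z_reading = z_score(61, mean=50, sd=10)  # 1.1

# Convert each z-score to a percentile using the standard normal curve.
print(NormalDist().cdf(z_math) * 100)     # ~93.3rd percentile
print(NormalDist().cdf(z_reading) * 100)  # ~86.4th percentile
```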
2007-02-27 11:45:48 · answer #3 · answered by PKM 2 · 0⤊ 0⤋
In probability and statistics, the standard deviation of a probability distribution, random variable, or population or multiset of values is a measure of the spread of its values. It is defined as the square root of the variance. Simply put, it is a measure of how much individual elements tend to deviate from the average.
2007-02-27 11:43:52 · answer #4 · answered by I am an Indian 4 · 0⤊ 0⤋
It measures dispersion, showing the spread of the data. The standard deviation equals the square root of the variance.
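Spelled out with hypothetical numbers, that relationship looks like this:

```python
import math

data = [2, 4, 4, 4, 5, 5, 7, 9]  # made-up values
mean = sum(data) / len(data)

# Variance: the average squared deviation from the mean (population form).
variance = sum((x - mean) ** 2 for x in data) / len(data)

# Standard deviation: the square root of the variance.
sd = math.sqrt(variance)

print(variance)  # 4.0
print(sd)        # 2.0
```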
2007-02-27 11:43:40 · answer #5 · answered by rjphillips246 2 · 0⤊ 0⤋
Put simply, it's a measure of how far the values in a list tend to be from the average value. It's useful for knowing how much they vary from the average.
2007-02-27 11:42:29 · answer #6 · answered by Anonymous · 0⤊ 0⤋
Good question.
I am afraid I have no answer...
2007-02-27 15:25:21 · answer #7 · answered by MARC H 2 · 0⤊ 1⤋