
I've done plenty of standard deviation calculations, but I don't understand what the end result represents. Can anyone help me understand?

I also have a physics problem that refers to using standard deviation to find the average range. That doesn't seem right to me. Am I misunderstanding the statement?

2006-09-29 18:49:40 · 9 answers · asked by ecogrl23 2 in Science & Mathematics Physics

9 answers

If you put data which you expect to be "normally distributed" (distributed symmetrically about the mean) into an SD calculation, it gives you an end result which, if added to and subtracted from the mean, indicates a range within which about 68% of data points from that population would lie.

For instance, if your mean is 100 and your SD is 15, then about 68% of the data points will lie between 85 and 115. If your range is mean +- 2 SD (i.e. 2 SD = 15 x 2 = 30, so 100 + 30 and 100 - 30), then about 95% of data points would be included within 100 +- 30, the range 70-130.

So it's a tool which uses the data you put in to give you an end result you can add to and subtract from the mean, giving a range within which a specified proportion of data from the same population can be predicted to lie.
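
To make that concrete, here's a minimal Python sketch; the numbers in `data` are invented purely for illustration:

```python
import statistics

# Hypothetical sample, invented for illustration only
data = [88, 95, 100, 103, 97, 110, 92, 105, 99, 101]

mean = statistics.mean(data)
sd = statistics.stdev(data)  # sample standard deviation

print(f"mean = {mean:.2f}, SD = {sd:.2f}")
print(f"mean +/- 1 SD: {mean - sd:.2f} to {mean + sd:.2f}  (~68% of a normal population)")
print(f"mean +/- 2 SD: {mean - 2*sd:.2f} to {mean + 2*sd:.2f}  (~95% of a normal population)")
```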

2006-09-29 19:18:07 · answer #1 · answered by servir tres frais 2 · 2 0

One way to answer this is to think about the Gaussian distribution, or normal distribution. Approximately 68% of all data points will lie within one standard deviation of the mean. So, in a sense, it gives a probable range of values for some random process.
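
You can check that ~68% figure by simulation; here's a quick Python sketch (the sample size is arbitrary):

```python
import random
import statistics

random.seed(0)  # reproducible
# 100,000 draws from a normal distribution with mean 0 and SD 1
sample = [random.gauss(0, 1) for _ in range(100_000)]

m = statistics.mean(sample)
s = statistics.stdev(sample)
within = sum(1 for x in sample if abs(x - m) <= s) / len(sample)
print(f"fraction within 1 SD of the mean: {within:.3f}")  # ~0.683
```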

2006-09-29 19:33:11 · answer #4 · answered by entropy 3 · 1 0

The standard deviation of a set of sample values is a measure of variation of values about the mean. It is a type of average deviation of values from the mean.
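
Written out as code, that definition is only a few lines; here's a from-scratch Python sketch (using the population form for simplicity, and the helper name is just for illustration):

```python
import math

def std_dev(values):
    """Population standard deviation: the rms deviation of values from their mean."""
    mean = sum(values) / len(values)
    return math.sqrt(sum((x - mean) ** 2 for x in values) / len(values))

# mean is 5; the squared deviations average to 4, so the SD is 2
print(std_dev([2, 4, 4, 4, 5, 5, 7, 9]))  # 2.0
```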

2006-09-29 19:03:49 · answer #5 · answered by Anonymous · 2 0

...dunno if this helps, but the standard deviation is the rms (root-mean-square) deviation from the mean of a series of values.

sigma (s here, 'cause I don't have a Greek typeset) is related to the infamous bell curve, or normal distribution, by the equation

y = a*e^(-b/2), where
a = 1/(s*sqrt(2*pi))
b = ((x - m)/s)^2
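
In Python, that formula reads as follows (a small sketch; the function name is just for illustration, with m and s defaulting to the standard normal):

```python
import math

def normal_pdf(x, m=0.0, s=1.0):
    # y = a * e^(-b/2), with a = 1/(s*sqrt(2*pi)) and b = ((x - m)/s)^2
    a = 1 / (s * math.sqrt(2 * math.pi))
    b = ((x - m) / s) ** 2
    return a * math.exp(-b / 2)

print(normal_pdf(0))  # peak of the standard normal curve, ~0.3989
print(normal_pdf(1))  # one SD from the mean, ~0.2420
```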

+/- 3s is widely used to define tolerance in manufacturing. 99.73% of all measurements in a normal distribution fall between these limits.

The area under the curve from -0.6745s to +0.6745s is 50% of the entire area under the curve (0.6745s is sometimes called the probable error). I believe this is what's meant by the average range.
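
Both area figures can be checked against the normal distribution: the fraction within z standard deviations of the mean is erf(z/sqrt(2)). A quick Python check (the helper name is just for illustration):

```python
import math

def frac_within(z):
    """Fraction of a normal distribution lying within z SDs of the mean."""
    return math.erf(z / math.sqrt(2))

print(f"+/- 1 SD:      {frac_within(1.0):.4f}")     # ~0.6827
print(f"+/- 3 SD:      {frac_within(3.0):.4f}")     # ~0.9973
print(f"+/- 0.6745 SD: {frac_within(0.6745):.4f}")  # ~0.5000
```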

2006-09-29 19:21:08 · answer #6 · answered by Helmut 7 · 1 0

If instead of thinking about statistical points, you thought about grains of sugar and poured the sugar in a pile onto the breakfast table, the standard deviation would be a measure of the neatness of your sugar pile. The smaller the number, the more compact the pile.

2006-09-29 21:25:10 · answer #7 · answered by Holden 5 · 1 0

Standard deviation is a more specific and complicated kind of average...

2006-09-29 19:03:56 · answer #8 · answered by J C 1 · 0 2

The distance away from the mean

2013-11-14 08:19:27 · answer #9 · answered by ? 1 · 0 0

Yeah, it just measures your regular, run-of-the-mill deviants, not the really crazy S&M ones... haha

2006-09-29 18:59:23 · answer #10 · answered by bigredretard2003 1 · 0 5
