What are the different confidence intervals for 2 Standard Errors vs. 2 Standard Deviations from a Linear Regression Trendline or Simple Moving Average?
For instance, suppose I compute the value that lies Two Standard Errors from a 50-day Linear Regression Trendline, and I also compute the value that lies Two Standard Deviations from that same 50-day Linear Regression Trendline. If those two values are dramatically different, they can't both correspond to a 95% Confidence Interval, can they? By way of example, in most stock trading software, Standard Error Bands are most often drawn around a Linear Regression Trendline, while Standard Deviation Bands are most often drawn around a Simple Moving Average. Can someone explain why this might be and what the different confidence intervals represent?
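To make the comparison concrete, here is a minimal Python sketch, using synthetic prices and a 50-period window purely as illustrative assumptions (not the asker's actual data or any specific charting package's formula). It fits a linear regression trendline to the window, computes the standard error of the estimate around that line, and compares it with the plain standard deviation of the prices about their simple moving average over the same window.

```python
import numpy as np

# Hypothetical data: 50 closing prices generated as a random walk, purely for illustration.
rng = np.random.default_rng(0)
prices = 100 + np.cumsum(rng.normal(0, 1, 50))

n = len(prices)
x = np.arange(n)

# 50-period linear regression trendline fitted to the prices.
slope, intercept = np.polyfit(x, prices, 1)
trend = slope * x + intercept
residuals = prices - trend

# Standard error of the estimate: scatter of prices around the fitted trendline,
# with n - 2 degrees of freedom because two parameters (slope, intercept) were estimated.
see = np.sqrt(np.sum(residuals**2) / (n - 2))

# Plain sample standard deviation of the prices about their mean over the window,
# which is what Standard Deviation Bands around an SMA typically use.
sma = prices.mean()           # 50-period simple moving average (single endpoint value)
std = prices.std(ddof=1)      # sample standard deviation about the SMA

print(f"trendline endpoint     : {trend[-1]:.2f}")
print(f"+/- 2 * standard error : {2 * see:.2f}")
print(f"SMA                    : {sma:.2f}")
print(f"+/- 2 * std deviation  : {2 * std:.2f}")
```

Running this generally shows the two band widths are different: the standard deviation about the SMA includes variation caused by the trend itself, while the standard error of the estimate measures only the scatter left over after the trendline has been fitted. Neither quantity multiplied by 2 automatically yields a 95% confidence interval unless the residuals satisfy the usual independence and normality assumptions.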
2006-10-12 12:34:26 · 1 answer · asked by Anonymous in Mathematics