I have one column of data for 3,500 days (Column A). I created a second column of numbers, 1 to 3,500 (Column B).
In Excel, I then entered the formulas =RSQ(A1:A3,B1:B3), =RSQ(A1:A4,B1:B4), and so on up to 100 days. As of today, the Linear Regression Trendline (LRT) that best fits my data is the 91-day LRT, with an R-squared value of 0.9685.
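For reference, here is a small Python sketch of the window search I described (the function name `best_window` and the variable `prices` are just placeholders for my Column A data; I'm using NumPy in place of the spreadsheet):

```python
# Sketch of the RSQ window search, mirroring =RSQ(A1:An, B1:Bn) for n = 3..100.
# Assumes `prices` holds the Column A data, ordered so that rows 1..n are the
# window of interest (as in my sheet).
import numpy as np

def best_window(prices, min_days=3, max_days=100):
    """Return (window_length, r_squared) for the best-fitting window."""
    best = (None, -1.0)
    for n in range(min_days, max_days + 1):
        y = np.asarray(prices[:n], dtype=float)   # like A1:An
        x = np.arange(1, n + 1, dtype=float)      # like B1:Bn (day index)
        r = np.corrcoef(x, y)[0, 1]               # Pearson correlation
        r2 = r * r                                # same quantity Excel's RSQ returns
        if r2 > best[1]:
            best = (n, r2)
    return best

# Example with made-up data:
# prices = np.cumsum(np.random.randn(3500)) + 100
# print(best_window(prices))   # in my case this is (91, 0.9685)
```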
To set outer bands for tomorrow's possible data, should I use standard error (SE) or standard deviation (SD)? In brief backtests, when an LRT has an R-squared > 0.9, my data normally reverses once it gets near 2 SE from that LRT, unless the LRT covers a short time period (between 3 and 10 days). If I use SD instead, the data rarely gets anywhere near 2 SD from an LRT with an R-squared > 0.9, again except over short time periods (3 to 10 days).
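To be clear about what I mean by the two bands, here is how I understand them in Python terms (the function name `lrt_bands` is mine; SE here is the standard error of the estimate, which I believe is what Excel's STEYX returns, and SD is the plain sample standard deviation of the same points):

```python
# Sketch of the two candidate bands around an LRT window (e.g. the 91-day LRT).
import numpy as np

def lrt_bands(y, k=2.0):
    y = np.asarray(y, dtype=float)
    n = len(y)
    x = np.arange(1, n + 1, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)        # least-squares trendline
    fitted = intercept + slope * x
    resid = y - fitted
    se = np.sqrt(np.sum(resid**2) / (n - 2))      # STEYX-style standard error of estimate
    sd = np.std(y, ddof=1)                        # sample SD of the raw data
    se_band = (fitted - k * se, fitted + k * se)  # the "2 SE" envelope
    sd_band = (fitted - k * sd, fitted + k * sd)  # the "2 SD" envelope
    return se, sd, se_band, sd_band
```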
If SE is the right choice, and I multiply the SE for the last 91 days by a z-value of 1.96, is the resulting confidence interval (CI) 95%? If not, how can I construct a 95% CI (comparable to a 2 SD band) from the SE?
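For concreteness, this is my understanding of how the 1.96 multiplier would be applied to get a band for tomorrow's point; the function name `next_day_interval` is mine, SciPy is used only for the t-value, and the t-based prediction-interval widening is my assumption about the more exact formula, so please correct me if this is wrong:

```python
# Sketch: with n = 91 and roughly normal residuals, fitted value +/- 1.96 * SE is
# approximately a 95% band. A slightly more exact prediction interval for a new
# point uses the t-distribution with n - 2 degrees of freedom and widens SE.
import numpy as np
from scipy import stats

def next_day_interval(y, conf=0.95):
    y = np.asarray(y, dtype=float)
    n = len(y)
    x = np.arange(1, n + 1, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (intercept + slope * x)
    se = np.sqrt(np.sum(resid**2) / (n - 2))     # standard error of the estimate
    x0 = n + 1                                   # tomorrow's day index
    # Prediction-error term for a brand-new observation at x0:
    pred_se = se * np.sqrt(1 + 1/n + (x0 - x.mean())**2 / np.sum((x - x.mean())**2))
    t = stats.t.ppf(0.5 + conf / 2, df=n - 2)    # about 1.99 for n = 91, vs z = 1.96
    center = intercept + slope * x0
    return center - t * pred_se, center + t * pred_se
```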
Asked by Anonymous in Science & Mathematics ➔ Mathematics · 2006-11-19 10:14:34 · 2 answers