So you have, let's say, a series S you want to estimate. You built a model to estimate S; call the model's output S'. You want S(t) = S'(t) at every point in time, but over a longer period you find they are completely off, so the model is not that great. Then you find things missing from the model and add them, which gives you a new estimate S'', which should now follow S more closely. However, sometimes S'' is off too, even if it is very close. Let's say S'' = S' + A + B + C, and S'' does a better job than S'. By keeping track of S, S', S'', A, B and C, and looking at a chart graphing them all or running regressions, how can you determine where the difference between S and S'' comes from? Should you add another term to S'', or is one of A, B, C itself missing something? I am not sure.
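One way to make the "running regressions" idea concrete is to regress the remaining residual S - S'' on the correction terms themselves. Below is a minimal sketch, not from the question itself; the array names (S, S2 for S'', A, B, C) are hypothetical and the series are assumed to be NumPy arrays. A coefficient far from zero suggests that term is mis-scaled or incomplete; a residual uncorrelated with all three points to a component missing from the model entirely.

```python
import numpy as np

def attribute_residual(S, S2, A, B, C):
    """Regress the residual S - S'' on A, B, C (plus an intercept)."""
    resid = S - S2
    X = np.column_stack([np.ones_like(A), A, B, C])
    # Least-squares fit of the leftover error against the known terms.
    coef, *_ = np.linalg.lstsq(X, resid, rcond=None)
    explained = X @ coef
    r2 = 1.0 - np.var(resid - explained) / np.var(resid)
    return coef, r2

# Toy demonstration with made-up series where B is under-weighted:
rng = np.random.default_rng(0)
t = np.arange(100.0)
A, B, C = np.sin(t / 5), t / 100, rng.normal(size=100)
S2 = 1.0 + A + B + C
S = S2 + 0.5 * B + rng.normal(scale=0.05, size=100)
print(attribute_residual(S, S2, A, B, C))  # coefficient on B comes out near 0.5
```

If the fit explains most of the residual (r2 near 1), rescaling the existing terms may be enough; if it explains little, the leftover error is roughly orthogonal to A, B and C, which argues for adding a new term.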

2006-08-18 00:53:27 · 2 answers · asked by Ani 1 in Business & Finance Other - Business & Finance

2 answers

A finite model of an infinite series will, by construction, eventually diverge from it. It may be necessary to build two separate models: a converging (Taylor) series for small values of t, and an asymptotic series for large values. Or, more generally, just figure out where the divergence from your original model becomes unacceptable, and construct a different model from that point on. Define your model in piecewise fashion, using as many segments as you need over your domain.
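A minimal sketch of the piecewise idea, using log(1+t) as a stand-in target (an assumption; the question never names the actual series). A truncated Taylor expansion works near t = 0, an asymptotic expansion log t + 1/t - 1/(2t^2) takes over for large t, and the model switches at a crossover point chosen where the two errors are comparable.

```python
import math

def taylor_log1p(t, n_terms=8):
    """Truncated Taylor series of log(1+t) around t = 0 (converges for |t| < 1)."""
    return sum((-1) ** (k + 1) * t ** k / k for k in range(1, n_terms + 1))

def asymptotic_log1p(t):
    """Asymptotic expansion of log(1+t), accurate for large t."""
    return math.log(t) + 1 / t - 1 / (2 * t ** 2)

def piecewise_model(t, crossover=0.8):
    """Use the converging series below the crossover, the asymptotic one above."""
    return taylor_log1p(t) if t < crossover else asymptotic_log1p(t)

for t in (0.1, 0.5, 2.0, 10.0):
    print(t, piecewise_model(t), math.log1p(t))  # model vs. true value
```

The crossover value 0.8 is itself a modeling choice, found in practice by checking where each segment's error becomes unacceptable, exactly as the answer suggests.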

2006-08-18 01:00:59 · answer #1 · answered by DavidK93 7 · 0 0

It is unrealistic to expect that S(t) = S'(t) exactly. It is more realistic to expect that S'(t) = E[S(t)] (that is, the predicted value equals the expected value of the series) and that the variance of your errors is at a minimum. Usually, if an estimator is BLUE (best linear unbiased: linear, unbiased, and minimum-variance among such estimators), we are happy.
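A minimal sketch of checking the two properties this answer names, assuming S (actuals) and S2 (the estimates S'') are NumPy arrays; the names are hypothetical. The mean residual should sit near zero (unbiasedness) and the residual variance should be small.

```python
import numpy as np

def bias_and_variance(S, S2):
    resid = np.asarray(S) - np.asarray(S2)
    bias = resid.mean()        # near 0 => the estimator is roughly unbiased
    var = resid.var(ddof=1)    # the error variance a BLUE estimator minimizes
    return bias, var
```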

2006-08-18 08:29:40 · answer #2 · answered by Jamestheflame 4 · 0 0
