
My econometrics professor said that when selecting the order of the AR process for a unit-root (ADF) test in Microfit, one should select the model with the *highest* (rather than the lowest, as usual) value of the Akaike information criterion or the Schwarz criterion, because Microfit performs some extra calculation that assigns a higher value to the regression that would normally carry the smallest.

How true is this? Help much appreciated and please cite your sources.

2007-03-19 09:49:15 · 2 answers · asked by Paul 1 in Social Science Economics

2 answers

I personally haven't used Microfit, but I was taught that you should select the order of an autoregressive model that minimizes the Akaike and Schwarz criteria. My professor also told us that if adding a lag makes one criterion go down and the other go up, you should follow the Schwarz criterion over the AIC (he didn't give a reason for that). Back to your question: I find it very strange that your professor told you to choose the highest value of the AIC. Both the AIC and the SIC have standard formulas, so I don't think Microfit computes them differently from other programs. The only plausible explanation I can think of is that Microfit might report the *negative* of the information criterion, which flips the rule: like the R-squared, the reported number would then be better when higher. I don't really see a practical use for that, though. So I recommend checking Microfit's help menu to see how the program calculates the AIC and SIC, and comparing that with the formula in your textbook.
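To make the lag-selection rule concrete, here is a minimal sketch (not Microfit's actual computation) that fits AR(p) models by OLS on a simulated series, computes the textbook AIC and SIC for each lag order, and picks the lowest. The simulated AR(2) data, the lag range 1-4, and the no-intercept specification are all illustrative assumptions. It also shows the sign-flip point: if a program reported the negated criterion, the same model would carry the *highest* reported value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: simulate an AR(2) series y_t = 0.5 y_{t-1} - 0.3 y_{t-2} + e_t
n_total = 500
y = np.zeros(n_total)
for t in range(2, n_total):
    y[t] = 0.5 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()

PMAX = 4  # largest lag order considered (assumption)

def fit_ar(y, p, pmax=PMAX):
    """OLS fit of AR(p) on the common sample t = pmax..T, so every lag
    order is compared on the same observations. Returns (RSS, n)."""
    Y = y[pmax:]
    X = np.column_stack([y[pmax - j: len(y) - j] for j in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    rss = np.sum((Y - X @ beta) ** 2)
    return rss, len(Y)

results = {}
for p in range(1, PMAX + 1):
    rss, n = fit_ar(y, p)
    k = p  # number of estimated parameters (no intercept in this sketch)
    aic = 2 * k + n * np.log(rss / n)          # lower is better
    sic = k * np.log(n) + n * np.log(rss / n)  # lower is better
    results[p] = (aic, sic)

best_aic = min(results, key=lambda p: results[p][0])
best_sic = min(results, key=lambda p: results[p][1])
print("lowest AIC at lag", best_aic, "| lowest SIC at lag", best_sic)

# If a package reported the criteria with the sign flipped (i.e. -AIC),
# the identical model choice would correspond to the HIGHEST reported value:
best_negated = max(results, key=lambda p: -results[p][0])
assert best_negated == best_aic
```

Note the design choice of fitting every lag order on the same sample (dropping the first PMAX observations throughout): information criteria are only comparable when the models are estimated on identical data.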

2007-03-21 11:08:41 · answer #1 · answered by ganapan7 3 · 0 0

In the general case, the AIC is

AIC = 2k - 2 ln(L)

where k is the number of parameters and L is the maximized value of the likelihood function.

Over the remainder of this entry, it will be assumed that the model errors are normally distributed. Let n be the number of observations and RSS be the residual sum of squares. Then AIC becomes
AIC = 2k + n ln(RSS/n)


Increasing the number of free parameters to be estimated improves the goodness of fit, regardless of the number of free parameters in the data generating process. Hence AIC not only rewards goodness of fit, but also includes a penalty that is an increasing function of the number of estimated parameters. This penalty discourages overfitting. The preferred model is the one with the lowest AIC value. The AIC methodology attempts to find the model that best explains the data with a minimum of free parameters. By contrast, more traditional approaches to modeling start from a null hypothesis. The AIC penalizes free parameters less strongly than does the Schwarz criterion.
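The two formulas above agree up to a model-independent constant: with normal errors, the Gaussian MLE of the error variance is RSS/n, so -2 ln(L) = n ln(2π) + n ln(RSS/n) + n, and the general and simplified AIC differ only by n(1 + ln 2π), which is the same for every model fitted on the same n observations. A short numerical check on a toy regression (the data and nested models are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.standard_normal((n, 3))
y = x @ np.array([1.0, 0.5, 0.0]) + rng.standard_normal(n)

for k in (1, 2, 3):  # nested models using the first k regressors
    X = x[:, :k]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    sigma2 = rss / n  # Gaussian MLE of the error variance
    loglik = -0.5 * n * (np.log(2 * np.pi) + np.log(sigma2) + 1)
    aic_general = 2 * k - 2 * loglik            # AIC = 2k - 2 ln(L)
    aic_simplified = 2 * k + n * np.log(rss / n)  # AIC = 2k + n ln(RSS/n)
    # The gap is the same constant n(1 + ln 2*pi) for every model,
    # so both versions rank the models identically:
    assert np.isclose(aic_general - aic_simplified, n * (1 + np.log(2 * np.pi)))
```

Because the gap is constant across models, minimizing either version selects the same model; only comparisons between AIC values matter, never their absolute level.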

2007-03-19 15:25:15 · answer #2 · answered by Santa Barbara 7 · 0 0
