Information Criteria
The two model selection rules considered here are the Akaike Information Criterion (AIC) and the Schwarz Criterion (SC); the SC is sometimes called the Bayesian Information Criterion (BIC). Both are computed by default in gretl and included in the standard regression output. The values that gretl reports are based on maximizing a log-likelihood function (assuming normally distributed errors). Other variants of these criteria have been suggested for use in linear regression, and they are presented in the equations below:
AIC = ln(SSE/N) + 2K/N                (6.9)

BIC = SC = ln(SSE/N) + K ln(N)/N      (6.10)
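For example, after any ols estimation these quantities can be computed directly from gretl's accessors. The following is a minimal sketch, assuming a data set is already open; the variable names y, x2, and x3 are purely illustrative:

ols y const x2 x3 --quiet                # any linear regression will do
scalar N = $nobs                         # number of observations used
scalar K = $ncoeff                       # number of estimated parameters
scalar aic = ln($ess/N) + 2*K/N          # equation (6.9)
scalar sc  = ln($ess/N) + K*ln(N)/N      # equation (6.10)
printf "AIC = %.4f  SC = %.4f\n", aic, sc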
The rule is: compute the AIC or SC for each model under consideration and choose the model that minimizes the desired criterion. The models should be evaluated over the same number of observations, i.e., for the same value of N. You can convert the values in (6.9) and (6.10) to the ones gretl reports with a simple transformation: add (1 + ln(2π)) and then multiply everything by N. Since the sample size is held constant when using model selection rules, the two different computations lead to exactly the same model choice.
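To see the correspondence, the transformation can also be applied in reverse to the values gretl reports: divide by N and subtract (1 + ln(2π)). A short sketch, again using illustrative variable names:

ols y const x2 x3 --quiet
scalar N = $nobs
printf "AIC as in (6.9):  %.4f\n", $aic/N - (1 + ln(2*$pi))
printf "SC as in (6.10):  %.4f\n", $bic/N - (1 + ln(2*$pi))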
Since the criteria have to be evaluated for each model estimated, it is worth writing a function in gretl that can be reused. Using functions to perform repetitive computations makes programs shorter and reduces errors (unless your function is wrong, in which case every computation is incorrect!). In the next section, I will introduce you to gretl functions and offer one that will compute the three model selection rules discussed above.
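As a preview only (this is not the function developed in the next section), a minimal sketch of such a reusable function might look like the following; the name modelsel_sketch and the choice to return the results as a row vector are merely illustrative:

function matrix modelsel_sketch (series y, list xlist)
    # estimate the model quietly and collect the pieces needed for (6.9) and (6.10)
    ols y xlist --quiet
    scalar N   = $nobs
    scalar K   = $ncoeff
    scalar aic = ln($ess/N) + 2*K/N
    scalar sc  = ln($ess/N) + K*ln(N)/N
    printf "K = %g, N = %g, AIC = %.4f, SC = %.4f\n", K, N, aic, sc
    matrix crit = {K, N, aic, sc}
    return crit
end function

# usage: include the constant in the regressor list
list xvars = const x2 x3
matrix crit = modelsel_sketch(y, xvars)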