A COMPANION TO Theoretical Econometrics

Methods for introducing inexact nonsample information

Economists usually bring general information about parameters to the estimation problem, but it is often not of the exact form discussed in the previous section. For example, we may know the signs of marginal effects, which translate into inequality restrictions on parameters. Or we may think that a parameter falls in the unit interval, and that there is a good chance it falls between 0.25 and 0.75. That is, we are able to suggest signs of parameters, and perhaps even ranges of reasonable values. While such information has long been available, it has been difficult to use in applications. Perhaps the biggest breakthrough in recent years has been the development of methods, and the distribution of software, that make it feasible to estimate linear (and nonlinear) models subject to inequality restrictions, and to implement Bayesian statistical methods.

The theory of inequality restricted least squares was developed some time ago; see Judge and Yancey (1986). However, the numerical problems of minimizing the sum of squared regression errors, or maximizing a likelihood function, subject to general inequality restrictions are substantial. Recently major software packages (SAS, GAUSS, GAMS) have made algorithms for such constrained optimization much more accessible. With inequality restrictions, such as β_k > 0, MSE gains require only that the direction of the inequality be correct.
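To illustrate how such a restriction is now imposed in practice, the following is a minimal sketch in Python (not the packages named above) using SciPy's bound-constrained least squares routine; the data, coefficient values, and the choice of which sign to restrict are hypothetical.

```python
import numpy as np
from scipy.optimize import lsq_linear

# Hypothetical example: impose the inequality restriction beta_2 >= 0
# while leaving the other coefficients unrestricted.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                           # illustrative regressors
y = X @ np.array([1.0, 0.5, -2.0]) + rng.normal(size=50)

lower = np.array([-np.inf, 0.0, -np.inf])              # beta_2 >= 0
upper = np.full(3, np.inf)

restricted = lsq_linear(X, y, bounds=(lower, upper))   # inequality restricted LS
unrestricted = np.linalg.lstsq(X, y, rcond=None)[0]    # ordinary least squares

print("OLS:       ", unrestricted)
print("Restricted:", restricted.x)
```

If the sign restriction is correct, the restricted estimator can only help in mean squared error terms, which is the point made above.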

The Bayesian paradigm is an alternative mode of thought. See Zellner (1971). In it we represent our uncertainty about parameter values using probability distributions. Inexact nonsample information is specified up front in the Bayesian world, by specifying a "prior" probability distribution for each parameter (in general a joint prior). The prior density can be centered over likely values. It can be a truncated distribution, putting zero prior probability on parameter values we rule out on theoretical grounds, and so on. When prior beliefs are combined with data, a multivariate probability distribution of the parameters is generated, called the posterior distribution, which summarizes all available information about the parameters.
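To make the mechanics concrete, here is a minimal sketch, assuming a normal prior for β and a known error variance, of how the prior and the data combine into the posterior in the normal linear model; the prior settings and data are illustrative, not part of the original discussion.

```python
import numpy as np

def normal_posterior(X, y, prior_mean, prior_cov, sigma2):
    """Posterior mean and covariance of beta in y = X beta + e, e ~ N(0, sigma2 I),
    under the conjugate prior beta ~ N(prior_mean, prior_cov)."""
    prior_prec = np.linalg.inv(prior_cov)
    post_prec = prior_prec + (X.T @ X) / sigma2        # precisions add
    post_cov = np.linalg.inv(post_prec)
    post_mean = post_cov @ (prior_prec @ prior_mean + (X.T @ y) / sigma2)
    return post_mean, post_cov

# Hypothetical prior centered over likely values, fairly diffuse
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 2))
y = X @ np.array([0.5, 1.5]) + rng.normal(size=40)
mean, cov = normal_posterior(X, y, prior_mean=np.zeros(2),
                             prior_cov=10.0 * np.eye(2), sigma2=1.0)
```

The posterior mean is a precision-weighted average of the prior mean and the data-based estimate, which is the sense in which the posterior "summarizes all available information."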

As noted in Judge et al. (1985, p. 908), Bayesians have no special problem dealing with the singularity or near-singularity of X'X. Their approach to the collinearity problem is to combine the prior densities on the parameters, β, with the sample information contained in the data to form a posterior density (see Zellner, 1971, pp. 75-81). The problem for Bayesians, as noted by Leamer (1978), is that when data are collinear the posterior distribution becomes very sensitive to changes in the prior. Small changes in the prior density result in large changes in the posterior, which complicates the use and analysis of the results in much the same way that collinearity makes inference imprecise in the classical theory of inference.

Bayesian theory is elegant, and logically consistent, but it has been a nightmare in practice. Suppose g(β) is the multivariate posterior distribution for the vector of regression parameters β. The problem is how to extract the information about a single parameter of interest, say β_k. The brute force method is to obtain the posterior density for β_k by integrating all the other parameters out of g(β). When the posterior distribution g(β) is complicated, as it usually is, this integration is a challenging problem.

The Bayesian miracle has been the development of computationally intensive, but logically simple, procedures for deriving the posterior densities for individual parameters. These procedures include the Gibbs sampler and the Metropolis and Metropolis-Hastings algorithms (Dorfman, 1997). These developments will soon make Bayesian analysis feasible in many economic applications.
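As a sketch of how logically simple these procedures are, the following random-walk Metropolis sampler (one variant of the algorithms cited above; the tuning constants and the generic log-posterior argument are assumptions for illustration) produces draws from the joint posterior, and the marginal density of any single coefficient is then approximated by a histogram of that coefficient's draws, with no analytical integration required.

```python
import numpy as np

def random_walk_metropolis(log_post, beta0, n_draws=5000, step=0.1, seed=0):
    """Random-walk Metropolis sampler: returns an array of posterior draws."""
    rng = np.random.default_rng(seed)
    beta = np.asarray(beta0, dtype=float)
    current_lp = log_post(beta)
    draws = np.empty((n_draws, beta.size))
    for i in range(n_draws):
        proposal = beta + step * rng.normal(size=beta.size)
        lp = log_post(proposal)
        if np.log(rng.uniform()) < lp - current_lp:     # accept or reject the move
            beta, current_lp = proposal, lp
        draws[i] = beta
    return draws

# The marginal posterior of, say, the first coefficient is then read off
# from the histogram of draws[:, 0].
```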

In passing we note that non-Bayesians have tried to incorporate similar information by making the exact restrictions in Section 5.1 inexact (Theil and Goldberger, 1961). This is achieved by adding a random disturbance v ~ (0, Q) to the exact restrictions, to obtain r = Rβ + v. This additional information is combined with the linear model as

\begin{bmatrix} y \\ r \end{bmatrix} = \begin{bmatrix} X \\ R \end{bmatrix} \beta + \begin{bmatrix} e \\ v \end{bmatrix} \qquad (12.13)

The resulting model is estimated by generalized least squares, which is called "mixed estimation" in this context. The difficulty, of course, apart from specifying the constraints, is the specification of the covariance matrix Q, reflecting parameter uncertainty.
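A minimal sketch of the resulting estimator, assuming regression errors with variance sigma2 independent of v and a researcher-specified Q, stacks the system as in (12.13) and applies generalized least squares; the function name and arguments are illustrative.

```python
import numpy as np

def mixed_estimator(X, y, R, r, sigma2, Q):
    """Theil-Goldberger mixed estimation: GLS on the stacked system
    [y; r] = [X; R] beta + [e; v], with Var(e) = sigma2*I and Var(v) = Q."""
    T, J = X.shape[0], R.shape[0]
    X_star = np.vstack([X, R])
    y_star = np.concatenate([y, r])
    V = np.block([[sigma2 * np.eye(T), np.zeros((T, J))],
                  [np.zeros((J, T)), Q]])
    V_inv = np.linalg.inv(V)
    return np.linalg.solve(X_star.T @ V_inv @ X_star, X_star.T @ V_inv @ y_star)
```

Making Q large relative to sigma2 expresses great uncertainty about the restrictions and pulls the estimator back toward ordinary least squares.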

Another estimation methodology has been introduced recently, based upon the maximum entropy principle (Golan, Judge, and Miller, 1996). This estimation method, instead of maximizing the likelihood function, or minimizing the sum of squared errors, maximizes the entropy function, subject to data and logical constraints. The method of maximum entropy is "nonparametric" in the sense that no specific probability distribution for the errors need be assumed. Like the Bayesian methodology, maximum entropy estimation requires the incorporation of prior information about the regression parameters at the outset. Golan, Judge and Miller find that the maximum entropy estimator, which like the Stein-rule is a shrinkage estimator, performs well in the presence of collinearity.
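The following is a rough sketch, under simplifying assumptions, of the generalized maximum entropy idea: each coefficient and each error is written as a probability-weighted average of researcher-chosen support points, and the entropy of those weights is maximized subject to the data constraints. The support vectors, solver choice, and all names here are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def gme_estimate(X, y, z_support, v_support):
    """Sketch of generalized maximum entropy estimation for y = X beta + e,
    with beta_k = z_support' p_k and e_t = v_support' w_t (p_k, w_t probabilities)."""
    z_support = np.asarray(z_support, dtype=float)   # supports must span plausible values
    v_support = np.asarray(v_support, dtype=float)
    T, K = X.shape
    M, J = len(z_support), len(v_support)
    n_p, n_w = K * M, T * J

    def unpack(theta):
        return theta[:n_p].reshape(K, M), theta[n_p:].reshape(T, J)

    def neg_entropy(theta):
        # maximizing entropy == minimizing sum of theta * log(theta)
        return np.sum(theta * np.log(theta + 1e-12))

    def data_constraint(theta):
        p, w = unpack(theta)
        return y - X @ (p @ z_support) - w @ v_support   # must equal zero

    def adding_up(theta):
        p, w = unpack(theta)
        return np.concatenate([p.sum(axis=1) - 1.0, w.sum(axis=1) - 1.0])

    theta0 = np.concatenate([np.full(n_p, 1.0 / M), np.full(n_w, 1.0 / J)])
    res = minimize(neg_entropy, theta0, method="SLSQP",
                   bounds=[(1e-8, 1.0)] * (n_p + n_w),
                   constraints=[{"type": "eq", "fun": data_constraint},
                                {"type": "eq", "fun": adding_up}])
    p, _ = unpack(res.x)
    return p @ z_support                                 # implied coefficient estimates
```

The support points play the role of the prior information: narrower supports shrink the estimates toward their centers, which is one sense in which the estimator behaves like a shrinkage rule.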
