
Test for Over-Identification Restrictions

We emphasized instrument relevance; now we turn to instrument exogeneity. Under just-identification, one cannot statistically test the instruments for exogeneity. The choice of exogenous instruments requires an expert judgement based on knowledge of the empirical application. However, if the first structural equation is over-identified, i.e., the number of instruments $\ell$ is larger than the number of right-hand-side variables $(g_1 + k_1)$, then one can test these over-identifying restrictions. A likelihood ratio test for this over-identification condition, based on maximum likelihood procedures, was given by Anderson and Rubin (1950). This version of the test requires the computation of LIML. It was later modified by Basmann (1960) so that it could be based on the 2SLS procedure. Here we present a simpler alternative based on Davidson and MacKinnon (1993) and Hausman (1983). In essence, one is testing

$H_0:\; y_1 = Z_1 \delta_1 + u_1 \quad \text{versus} \quad H_1:\; y_1 = Z_1 \delta_1 + W^* \gamma + u_1$    (11.47)

where $u_1 \sim \mathrm{IID}(0, \sigma_{11} I_T)$. Let $W$ be the matrix of instruments of full rank $\ell$. Also, let $W^*$ be a subset of the instruments in $W$, of dimension $\ell - (g_1 + k_1)$, that is linearly independent of $\widehat{Z}_1 = P_W Z_1$. In this case, the matrix $[\widehat{Z}_1, W^*]$ has full rank $\ell$ and therefore spans the same space as $W$. A test for over-identification is a test for $\gamma = 0$. In other words, $W^*$ has no ability to explain any variation in $y_1$ that is not already explained by $\widehat{Z}_1$ using the matrix of instruments $W$.

If $W^*$ is correlated with $u_1$, or if the first structural equation (11.34) is misspecified, say by $Z_1$ not including some variables in $W^*$, then $\gamma \neq 0$. Hence, testing $\gamma = 0$ should be interpreted as a joint test for the validity of the matrix of instruments $W$ and the proper specification of (11.34); see Davidson and MacKinnon (1993). A test of $H_0: \gamma = 0$ can be obtained as an asymptotic F-test as follows:

$\dfrac{(RRSS^* - URSS^*)/(\ell - (g_1 + k_1))}{URSS/(T - \ell)}$    (11.48)

This is asymptotically distributed as $F(\ell - (g_1 + k_1), T - \ell)$ under $H_0$. Using the instruments $W$, we regress $Z_1$ on $W$ and get $\widehat{Z}_1$, then obtain the restricted 2SLS estimate $\widehat{\delta}_{1,2SLS}$ by regressing $y_1$ on $\widehat{Z}_1$. The restricted residual sum of squares from this second-stage regression is $RRSS^* = (y_1 - \widehat{Z}_1 \widehat{\delta}_{1,2SLS})'(y_1 - \widehat{Z}_1 \widehat{\delta}_{1,2SLS})$. Next, we regress $y_1$ on $\widehat{Z}_1$ and $W^*$ to get the unrestricted 2SLS estimates $\widetilde{\delta}_{1,2SLS}$ and $\widetilde{\gamma}_{2SLS}$. The unrestricted residual sum of squares from this second-stage regression is $URSS^* = (y_1 - \widehat{Z}_1 \widetilde{\delta}_{1,2SLS} - W^* \widetilde{\gamma}_{2SLS})'(y_1 - \widehat{Z}_1 \widetilde{\delta}_{1,2SLS} - W^* \widetilde{\gamma}_{2SLS})$. The $URSS$ in (11.48) is the 2SLS residual sum of squares from the unrestricted model, obtained as $(y_1 - Z_1 \widetilde{\delta}_{1,2SLS} - W^* \widetilde{\gamma}_{2SLS})'(y_1 - Z_1 \widetilde{\delta}_{1,2SLS} - W^* \widetilde{\gamma}_{2SLS})$. $URSS$ differs from $URSS^*$ in that $Z_1$ rather than $\widehat{Z}_1$ is used in obtaining the residuals. Note that this differs from the Chow test in that the denominator is not based on $URSS^*$; see Wooldridge (1990).
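As a concrete illustration, the following is a minimal sketch of how (11.48) can be computed on simulated data with NumPy. The data-generating process, the sample size, and all variable names (T, ell, W_star, RRSS_star, and so on) are illustrative assumptions, not part of the text, and the particular choice of $W^*$ is just one admissible one.

```python
import numpy as np

rng = np.random.default_rng(0)
T, ell = 200, 4                       # sample size and number of instruments (ell)
g1, k1 = 1, 1                         # one RHS endogenous variable, one included exogenous variable

W = rng.normal(size=(T, ell))         # instrument matrix of full rank ell
u1 = rng.normal(size=T)
y2 = W @ np.array([1.0, 0.5, 0.5, 0.5]) + 0.5 * u1 + rng.normal(size=T)   # endogenous regressor
Z1 = np.column_stack([y2, W[:, 0]])   # Z1: endogenous regressor plus included exogenous variable
y1 = Z1 @ np.array([1.0, -1.0]) + u1

P_W = W @ np.linalg.solve(W.T @ W, W.T)     # projection onto the column space of W
Z1_hat = P_W @ Z1                           # first-stage fitted values, Z1_hat = P_W Z1

# Restricted second stage: regress y1 on Z1_hat
d_r = np.linalg.lstsq(Z1_hat, y1, rcond=None)[0]
RRSS_star = np.sum((y1 - Z1_hat @ d_r) ** 2)

# Unrestricted second stage: regress y1 on [Z1_hat, W*], with W* any ell - (g1 + k1)
# instruments linearly independent of Z1_hat (here the last two columns of W)
W_star = W[:, 2:]
X_u = np.column_stack([Z1_hat, W_star])
b_u = np.linalg.lstsq(X_u, y1, rcond=None)[0]
URSS_star = np.sum((y1 - X_u @ b_u) ** 2)

# URSS uses Z1 rather than Z1_hat with the unrestricted coefficient estimates
URSS = np.sum((y1 - np.column_stack([Z1, W_star]) @ b_u) ** 2)

F = ((RRSS_star - URSS_star) / (ell - (g1 + k1))) / (URSS / (T - ell))
print(F)    # compare with critical values of F(ell - (g1 + k1), T - ell)
```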

This test does not require the construction of $W^*$ for its implementation, because the model under $H_1$ is just-identified, with as many regressors as there are instruments. This means that its $URSS^* = y_1' \bar{P}_W y_1 = y_1' y_1 - y_1' P_W y_1$; see problem 10. It is easy to show, see problem 12, that $RRSS^* = y_1' \bar{P}_{\widehat{Z}_1} y_1 = y_1' y_1 - y_1' P_{\widehat{Z}_1} y_1$, where $\widehat{Z}_1 = P_W Z_1$. Hence,

$RRSS^* - URSS^* = y_1' P_W y_1 - y_1' P_{\widehat{Z}_1} y_1$    (11.49)

The test for over-identification can therefore be based on $RRSS^* - URSS^*$ divided by a consistent estimate of $\sigma_{11}$, say

$\widehat{\sigma}_{11} = (y_1 - Z_1 \widehat{\delta}_{1,2SLS})'(y_1 - Z_1 \widehat{\delta}_{1,2SLS})/T$    (11.50)

Problem 12 shows that the resulting test statistic is exactly that proposed by Hausman (1983). In a nutshell, the Hausman over-identification test regresses the 2SLS residuals $y_1 - Z_1 \widehat{\delta}_{1,2SLS}$ on the matrix $W$ of all predetermined variables in the model. The test statistic is $T$ times the uncentered $R^2$ of this regression; see the Appendix to Chapter 3 for a definition of the uncentered $R^2$. This test statistic is asymptotically distributed as $\chi^2$ with $\ell - (g_1 + k_1)$ degrees of freedom. Large values of this statistic reject the null hypothesis.
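The Hausman form is equally easy to compute. The sketch below continues from the simulated objects defined above (P_W, Z1_hat, Z1, y1, T, RRSS_star, URSS_star are assumed to be in scope) and checks numerically that $T R_u^2$ coincides with $(RRSS^* - URSS^*)/\widehat{\sigma}_{11}$.

```python
# Hausman (1983) form: regress the 2SLS residuals on W and compute T times the
# uncentered R-squared. Continues the simulated objects from the sketch above.
d_2sls = np.linalg.lstsq(Z1_hat, y1, rcond=None)[0]      # restricted 2SLS estimate of delta_1
e_2sls = y1 - Z1 @ d_2sls                                # 2SLS residuals (note: Z1, not Z1_hat)

fitted = P_W @ e_2sls                                    # fitted values from regressing e_2sls on W
R2_u = (fitted @ fitted) / (e_2sls @ e_2sls)             # uncentered R-squared
hausman_stat = T * R2_u                                  # ~ chi-square(ell - (g1 + k1)) under H0

sigma11_hat = (e_2sls @ e_2sls) / T                      # consistent estimate of sigma_11, eq. (11.50)
print(hausman_stat, (RRSS_star - URSS_star) / sigma11_hat)   # the two forms coincide
```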

Alternatively, one can obtain this test statistic as a Gauss-Newton Regression (GNR) on the unrestricted model in (11.47). To see this, recall from section 8.4 that the GNR applies to a general nonlinear model $y_t = x_t(\beta) + u_t$. Using the set of instruments $W$, the GNR becomes

$y - x(\widetilde{\beta}) = P_W X(\widetilde{\beta})\, b + \text{residuals}$

where $\widetilde{\beta}$ denotes the restricted instrumental variable estimate of $\beta$ under the null hypothesis and $X(\beta)$ is the matrix of derivatives with typical element $X_j(\beta) = \partial x(\beta)/\partial \beta_j$ for $j = 1, \ldots, k$. Thus, the only difference between this GNR and that of Chapter 8 is that the regressors are multiplied by $P_W$; see Davidson and MacKinnon (1993, p. 226). Therefore, the GNR for (11.47) yields

$y_1 - Z_1 \widehat{\delta}_{1,2SLS} = \widehat{Z}_1 b_1 + W^* b_2 + \text{residuals}$    (11.51)

since $P_W[\widehat{Z}_1, W^*] = [\widehat{Z}_1, W^*]$ and $\widehat{\delta}_{1,2SLS}$ is the restricted estimator under $H_0: \gamma = 0$. But $[\widehat{Z}_1, W^*]$ spans the same space as $W$; see problem 12. Hence, the GNR in (11.51) is equivalent to running the 2SLS residuals on $W$ and computing $T$ times the uncentered $R^2$ as described above. Once again, it is clear that $W^*$ need not be constructed.
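As a numerical check on this equivalence, one can run the GNR (11.51) directly. The sketch below again continues the simulated objects defined earlier (Z1_hat, W_star, e_2sls, T); it is only an illustration of the equivalence, not part of the text.

```python
# GNR form (11.51): regress the 2SLS residuals on [Z1_hat, W*]; since this matrix
# spans the same space as W, T times the uncentered R-squared is unchanged.
X_gnr = np.column_stack([Z1_hat, W_star])
P_gnr = X_gnr @ np.linalg.solve(X_gnr.T @ X_gnr, X_gnr.T)
fitted_gnr = P_gnr @ e_2sls
print(T * (fitted_gnr @ fitted_gnr) / (e_2sls @ e_2sls))     # equals hausman_stat above
```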

The basic intuition behind the test for over-identifying restrictions rests on the fact that one can compute several legitimate IV estimators if all the instruments are relevant and exogenous. For example, suppose there are two instruments and one right-hand-side endogenous variable. Then one can compute two IV estimators, using each instrument separately. If these IV estimators produce very different estimates, then maybe one instrument or the other, or both, are not exogenous. The over-identification test we just described implicitly makes this comparison without actually computing all possible IV estimates. Exogenous instruments have to be uncorrelated with the disturbances, which suggests that the 2SLS residuals have to be uncorrelated with the instruments. This is the basis for the $T R_u^2$ test statistic. If all the instruments are exogenous, the regression coefficient estimates should all be insignificantly different from zero and the $R_u^2$ should be low.
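To see this intuition at work, consider a small simulation, separate from the one above, with one endogenous regressor and two candidate instruments, one of which is deliberately constructed to be correlated with the disturbance. All names, parameter values, and the data-generating process are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
u = rng.normal(size=n)
w1 = rng.normal(size=n)                      # valid instrument: uncorrelated with u
w2 = rng.normal(size=n) + 0.8 * u            # invalid instrument: correlated with u
x = w1 + w2 + 0.5 * u + rng.normal(size=n)   # endogenous right-hand-side variable
y = 2.0 * x + u                              # true coefficient is 2

beta_iv1 = (w1 @ y) / (w1 @ x)               # simple IV estimate using w1 alone
beta_iv2 = (w2 @ y) / (w2 @ x)               # simple IV estimate using w2 alone
print(beta_iv1, beta_iv2)                    # the two estimates diverge, signalling trouble
```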
