Springer Texts in Business and Economics

Testing Linear Restrictions

In the simple linear regression chapter, we proved that the OLS estimates are BLUE provided assumptions 1 to 4 are satisfied. We then imposed normality on the disturbances, assumption 5, and proved that the OLS estimators are in fact the maximum likelihood estimators. We then derived the Cramér-Rao lower bound and proved that these estimators are efficient. This will be done in matrix form in Chapter 7 for the multiple regression case. Under normality one can test hypotheses about the regression. Basically, any regression package will report the OLS estimates, their standard errors, and the corresponding t-statistics for the null hypothesis that each individual coefficient is zero. These are tests of significance for each coefficient separately. But one may be interested in a joint test of significance for two or more coefficients simultaneously, or simply in testing whether linear restrictions on the coefficients of the regression are satisfied. This will be developed more formally in Chapter 7. For now, all we assume is that the reader can perform regressions using his or her favorite software, like EViews, Stata, SAS, TSP, SHAZAM, LIMDEP or GAUSS. The solutions to (4.2) or (4.3) result in the OLS estimates. These multiple regression coefficient estimates can be interpreted as simple regression estimates, as shown in section 4.3. This allows a simple derivation of their standard errors. Now we would like to use these regressions to test linear restrictions. The strategy followed is to impose these restrictions on the model and run the resulting restricted regression. The corresponding Restricted Residual Sums of Squares is denoted by RRSS. Next, one runs the regression without imposing these linear restrictions to obtain the Unrestricted Residual Sums of Squares, which we denote by URSS. Finally, one forms the following F-statistic:

F = [(RRSS − URSS)/ℓ] / [URSS/(n − K)]    (4.17)

where ℓ denotes the number of restrictions, and n − K gives the degrees of freedom of the unrestricted model. The idea behind this test is intuitive. If the restrictions are true, then the RRSS should not be much different from the URSS. If RRSS differs substantially from URSS, then we reject these restrictions. The denominator of the F-statistic is a consistent estimate of the unrestricted regression variance. Dividing by the latter makes the F-statistic invariant to units of measurement. Let us consider some examples:
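As a quick illustration, here is a minimal Python sketch of how (4.17) can be computed once the two residual sums of squares are in hand. The numeric inputs below are hypothetical, chosen only to show the mechanics.

```python
# A minimal sketch of the F-statistic in (4.17); all inputs are hypothetical.
from scipy import stats

def restriction_F(rrss, urss, n_restr, n, k):
    """F = [(RRSS - URSS)/l] / [URSS/(n - K)], distributed F(l, n - K)
    under the null that the l linear restrictions hold."""
    f_stat = ((rrss - urss) / n_restr) / (urss / (n - k))
    p_value = stats.f.sf(f_stat, n_restr, n - k)  # upper-tail probability
    return f_stat, p_value

# Hypothetical example: n = 100 observations, K = 5 parameters, 2 restrictions
f_stat, p = restriction_F(rrss=520.0, urss=480.0, n_restr=2, n=100, k=5)
print(f"F = {f_stat:.3f}, p-value = {p:.4f}")
```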

Example 2: Testing the joint significance of two regression coefficients. For example, let us test the null hypothesis H0: β2 = β3 = 0. These are two restrictions, β2 = 0 and β3 = 0, and they are to be tested jointly. We know how to test β2 = 0 alone or β3 = 0 alone with individual t-tests, but here we want a test of the joint significance of the two coefficients. Imposing this restriction means the removal of X2 and X3 from the regression, i.e., running the regression of Y on X4, ..., XK excluding X2 and X3. Hence, the number of parameters to be estimated becomes (K − 2) and the degrees of freedom of this restricted regression are n − (K − 2). The unrestricted regression is the one including all the X's in the model. Its degrees of freedom are (n − K). The number of restrictions is 2, and this can also be inferred from the difference between the degrees of freedom of the restricted and unrestricted regressions. All the ingredients are now available for computing F in (4.17), and this statistic will be distributed as F(2, n − K).
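Below is a small Python sketch of this example on simulated data; the data-generating process and variable layout are our own illustrative choices, not from the text. The restricted regression simply drops X2 and X3, and the two residual sums of squares are plugged into (4.17).

```python
# Sketch of Example 2 on simulated data: test H0: beta2 = beta3 = 0.
import numpy as np

rng = np.random.default_rng(0)
n, K = 100, 4                      # intercept plus X2, X3, X4
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
y = X @ np.array([1.0, 0.5, -0.3, 0.8]) + rng.normal(size=n)

def rss(y, X):
    """Residual sum of squares from an OLS regression of y on X."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ beta
    return e @ e

urss = rss(y, X)                   # unrestricted: all regressors
rrss = rss(y, X[:, [0, 3]])        # restricted: drop X2 and X3 (columns 1, 2)
ell = 2
F = ((rrss - urss) / ell) / (urss / (n - K))
print(f"F({ell}, {n - K}) = {F:.3f}")
```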

Example 3: Test the equality of two regression coefficients, H0: β3 = β4, against the alternative H1: β3 ≠ β4. Note that H0 can be rewritten as H0: β3 − β4 = 0. This can be tested using a t-statistic that tests whether d = β3 − β4 is equal to zero. From the unrestricted regression, we can obtain d̂ = β̂3 − β̂4 with var(d̂) = var(β̂3) + var(β̂4) − 2cov(β̂3, β̂4). The variance-covariance matrix of the regression coefficients can be printed out with any regression package. In section 4.3, we gave these variances and covariances a simple regression interpretation. This means that se(d̂) = √var(d̂), and the t-statistic is simply t = (d̂ − 0)/se(d̂), which is distributed as t(n − K) under H0. Alternatively, one can run an F-test with the RRSS obtained from running the following regression

Yi = α + β2X2i + β3(X3i + X4i) + β5X5i + ... + βKXKi + ui

with β3 substituted in for β4. This regression has the variable (X3i + X4i) rather than X3i and X4i separately. The URSS is obtained from the regression of Y on all the X's in the model. The degrees of freedom of the resulting F-statistic are 1 and n − K. The numerator degrees of freedom indicate that there is only one restriction. It will be proved in Chapter 7 that the square of the t-statistic is exactly equal to the F-statistic just derived. Both methods of testing are equivalent. The first computes only the unrestricted regression and involves some further variance computations, while the latter involves running two regressions and computing the usual F-statistic.
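The equivalence of the two approaches can be checked numerically. The following Python sketch, again on simulated data of our own choosing, computes the t-statistic from the estimated variance-covariance matrix and the F-statistic from the restricted regression, and confirms that t² = F.

```python
# Sketch of Example 3 on simulated data: test H0: beta3 = beta4 two ways.
import numpy as np

rng = np.random.default_rng(1)
n, K = 120, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])  # 1, X2, X3, X4
y = X @ np.array([2.0, 1.0, 0.7, 0.7]) + rng.normal(size=n)  # H0 holds here

# Unrestricted OLS with the estimated coefficient covariance matrix
XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y
e = y - X @ beta
s2 = e @ e / (n - K)
V = s2 * XtX_inv                    # var-cov matrix of the OLS estimates

# t-test on d = beta3 - beta4 (columns 2 and 3 of X)
d = beta[2] - beta[3]
var_d = V[2, 2] + V[3, 3] - 2 * V[2, 3]
t = d / np.sqrt(var_d)

# F-test: the restricted regression replaces X3 and X4 by their sum
Xr = np.column_stack([X[:, 0], X[:, 1], X[:, 2] + X[:, 3]])
er = y - Xr @ np.linalg.lstsq(Xr, y, rcond=None)[0]
F = ((er @ er - e @ e) / 1) / (e @ e / (n - K))
print(f"t = {t:.4f}, t^2 = {t**2:.4f}, F = {F:.4f}")  # t^2 equals F
```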

Example 4: Test the joint hypothesis H0: β3 = 1 and β2 − 2β4 = 0. These two restrictions are usually obtained from prior information or imposed by theory. The first restriction is β3 = 1; the value 1 could have been any other constant. The second restriction says that a linear combination of β2 and β4 is equal to zero, i.e., β4 = β2/2. Substituting these restrictions in (4.1) we get

Yi = α + β2X2i + X3i + (β2/2)X4i + β5X5i + ... + βKXKi + ui

which can be written as

Yi − X3i = α + β2(X2i + X4i/2) + β5X5i + ... + βKXKi + ui

Therefore, the RRSS can be obtained by regressing (Y − X3) on (X2 + X4/2), X5, ..., XK. This regression has n − (K − 2) degrees of freedom. The URSS is obtained from the regression with all the X's included. The resulting F-statistic has 2 and n − K degrees of freedom.
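Here is a Python sketch of this substitution on simulated data; the data and coefficient values are illustrative and are generated to satisfy the two restrictions, so the computed F should be small.

```python
# Sketch of Example 4 on simulated data: impose beta3 = 1 and
# beta2 - 2*beta4 = 0 (so beta4 = beta2/2) by transforming variables.
import numpy as np

rng = np.random.default_rng(2)
n, K = 150, 5
X = np.column_stack([np.ones(n), rng.normal(size=(n, 4))])  # 1, X2..X5
y = X @ np.array([1.0, 0.8, 1.0, 0.4, -0.5]) + rng.normal(size=n)

def rss(y, X):
    e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return e @ e

urss = rss(y, X)
# Restricted regression: (Y - X3) on a constant, (X2 + 0.5*X4), and X5
y_r = y - X[:, 2]
X_r = np.column_stack([X[:, 0], X[:, 1] + 0.5 * X[:, 3], X[:, 4]])
rrss = rss(y_r, X_r)
F = ((rrss - urss) / 2) / (urss / (n - K))
print(f"F(2, {n - K}) = {F:.3f}")
```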

Example 5: Testing constant returns to scale in a Cobb-Douglas production function. Q = A K^α L^β E^γ M^δ e^u is a Cobb-Douglas production function with capital (K), labor (L), energy (E) and materials (M). Constant returns to scale means that a proportional increase in the inputs produces the same proportional increase in output. Let this proportional increase be λ; then K* = λK, L* = λL, E* = λE and M* = λM, so that Q* = A (λK)^α (λL)^β (λE)^γ (λM)^δ e^u = λ^(α+β+γ+δ) Q. For this last term to be equal to λQ, the following restriction must hold: α + β + γ + δ = 1. Hence, a test of constant returns to scale is equivalent to testing H0: α + β + γ + δ = 1. The Cobb-Douglas production function is nonlinear in the variables, but can be linearized by taking logs of both sides, i.e.,

log Q = log A + α log K + β log L + γ log E + δ log M + u    (4.18)

This is a linear regression with Y = log Q, X2 = log K, X3 = log L, X4 = log E and X5 = log M. Ordinary least squares on this log-linear model is BLUE as long as u satisfies assumptions 1-4. Note that these disturbances entered the original Cobb-Douglas production function multiplicatively as exp(u). Had these disturbances entered additively, as Q = A K^α L^β E^γ M^δ + u, then taking logs would not simplify the right hand side, and one would have to estimate this with nonlinear least squares; see Chapter 8. Now we can test constant returns to scale as follows. The unrestricted regression is given by (4.18) and its degrees of freedom are n − 5. Imposing H0 means substituting the linear restriction by replacing, say, β by (1 − α − γ − δ). After collecting terms, this results in the following restricted regression with one less parameter

log(Q/L) = log A + α log(K/L) + γ log(E/L) + δ log(M/L) + u    (4.19)

The degrees of freedom are n − 4. Once again, all the ingredients for the test in (4.17) are there, and since there is one restriction this statistic is distributed as F(1, n − 5) under the null hypothesis.
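The following Python sketch runs this test on simulated data generated to satisfy constant returns to scale; the series and coefficient values are our own illustrative choices, not from the text.

```python
# Sketch of Example 5: test constant returns to scale in the log-linear
# Cobb-Douglas regression (4.18) against the restricted form (4.19).
import numpy as np

rng = np.random.default_rng(3)
n = 200
logK, logL, logE, logM = rng.normal(size=(4, n))
# True technology satisfies alpha+beta+gamma+delta = 0.3+0.4+0.2+0.1 = 1
logQ = (0.5 + 0.3*logK + 0.4*logL + 0.2*logE + 0.1*logM
        + 0.05*rng.normal(size=n))

def rss(y, X):
    e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return e @ e

# Unrestricted regression (4.18): n - 5 degrees of freedom
Xu = np.column_stack([np.ones(n), logK, logL, logE, logM])
urss = rss(logQ, Xu)
# Restricted regression (4.19): everything measured relative to labor
Xr = np.column_stack([np.ones(n), logK - logL, logE - logL, logM - logL])
rrss = rss(logQ - logL, Xr)
F = (rrss - urss) / (urss / (n - 5))     # one restriction
print(f"F(1, {n - 5}) = {F:.3f}")
```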

Example 6: Joint significance of all the slope coefficients. The null hypothesis is H0: β2 = β3 = ... = βK = 0 against the alternative H1: at least one βk ≠ 0 for k = 2, ..., K. Under the null, only the constant is left in the regression. Problem 3.2 showed that for a regression of Y on a constant only, the least squares estimate of α is Ȳ. This means that the corresponding residual sum of squares is ∑ⁿᵢ₌₁(Yi − Ȳ)². Therefore, RRSS = total sum of squares of regression (4.1) = ∑ⁿᵢ₌₁ yᵢ², where yᵢ = Yᵢ − Ȳ.

The URSS is the usual residual sum of squares ∑ⁿᵢ₌₁ eᵢ² from the unrestricted regression given by (4.1). Hence, the corresponding F-statistic for H0 is

F = [(TSS − RSS)/(K − 1)] / [RSS/(n − K)]
  = [(∑ⁿᵢ₌₁ yᵢ² − ∑ⁿᵢ₌₁ eᵢ²)/(K − 1)] / [∑ⁿᵢ₌₁ eᵢ²/(n − K)]
  = [R²/(1 − R²)] · [(n − K)/(K − 1)]    (4.20)

where R² = 1 − (∑ⁿᵢ₌₁ eᵢ²/∑ⁿᵢ₌₁ yᵢ²). This F-statistic has (K − 1) and (n − K) degrees of freedom under H0, and is usually reported by regression packages.
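As a numerical check, the Python sketch below (simulated data, illustrative values) computes the overall F-statistic both from the residual sums of squares and from the R² form in (4.20); the two coincide.

```python
# Sketch of Example 6: the overall F-statistic from the residual sums of
# squares matches the R-squared form in (4.20).
import numpy as np

rng = np.random.default_rng(4)
n, K = 80, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
y = X @ np.array([1.0, 0.6, -0.4, 0.2]) + rng.normal(size=n)

beta = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ beta
urss = e @ e                          # sum of e_i^2
rrss = np.sum((y - y.mean())**2)      # TSS: regression on a constant only
R2 = 1 - urss / rrss

F_rss = ((rrss - urss) / (K - 1)) / (urss / (n - K))
F_r2 = (R2 / (1 - R2)) * ((n - K) / (K - 1))
print(f"{F_rss:.6f} == {F_r2:.6f}")   # the two forms coincide
```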
