A COMPANION TO Theoretical Econometrics

# The Nature and Statistical Consequences of Collinearity

Consider first a linear regression model with two explanatory variables,

$$y_t = \beta_1 + \beta_2 x_{t2} + \beta_3 x_{t3} + e_t.$$

Assume that the errors are uncorrelated, with mean zero and constant variance, $\sigma^2$, and that $x_{t2}$ and $x_{t3}$ are nonstochastic. Under these assumptions the least squares estimators are the best linear unbiased estimators of the regression parameters. The variance of the least squares estimator $b_2$ of $\beta_2$ is

$$\operatorname{var}(b_2) = \frac{\sigma^2}{\sum_{t=1}^{T}(x_{t2} - \bar{x}_2)^2\,(1 - r_{23}^2)},$$

where $\bar{x}_2$ is the sample mean of the $T$ observations on $x_{t2}$, and $r_{23}$ is the sample correlation between $x_{t2}$ and $x_{t3}$. The formula for the variance of $b_3$, the least squares estimator of $\beta_3$, is analogous, but the variance of the intercept estimator is messier and we will not discuss it here. The covariance between $b_2$ and $b_3$ is

$$\operatorname{cov}(b_2, b_3) = \frac{-r_{23}\,\sigma^2}{(1 - r_{23}^2)\sqrt{\sum_{t=1}^{T}(x_{t2} - \bar{x}_2)^2}\sqrt{\sum_{t=1}^{T}(x_{t3} - \bar{x}_3)^2}}. \tag{12.3}$$

The variance and covariance expressions reveal the consequences of two of the three forms of collinearity. First, suppose that $x_{t2}$ exhibits little variation about its sample mean, so that $\sum(x_{t2} - \bar{x}_2)^2$ is small. The less the variation in the explanatory variable $x_{t2}$ about its mean, the larger will be the variance of $b_2$, and the larger will be the covariance, in absolute value, between $b_2$ and $b_3$. Second, the larger the correlation between $x_{t2}$ and $x_{t3}$, the larger will be the variance of $b_2$, and the larger will be the covariance, in absolute value, between $b_2$ and $b_3$. If the correlation is positive the covariance will be negative. This is the source of another conventional observation about collinearity, namely that the coefficients of highly correlated variables tend to have opposite signs.
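These closed-form expressions can be checked numerically against the full least squares covariance matrix $\sigma^2(X'X)^{-1}$. The sketch below uses made-up data (the sample size, $\sigma^2$, and regressor construction are illustrative assumptions, not from the text) and confirms that the variance and covariance formulas above reproduce the corresponding elements of $\sigma^2(X'X)^{-1}$:

```python
import numpy as np

# Illustrative check with invented data: for the two-regressor model,
# the closed-form var(b2) and cov(b2, b3) match the corresponding
# elements of sigma^2 * (X'X)^{-1}.
rng = np.random.default_rng(0)
T, sigma2 = 50, 4.0
x2 = rng.normal(size=T)
x3 = 0.9 * x2 + 0.3 * rng.normal(size=T)      # deliberately correlated regressors

X = np.column_stack([np.ones(T), x2, x3])
cov_b = sigma2 * np.linalg.inv(X.T @ X)        # exact LS covariance matrix

r23 = np.corrcoef(x2, x3)[0, 1]
S22 = np.sum((x2 - x2.mean()) ** 2)
S33 = np.sum((x3 - x3.mean()) ** 2)

var_b2_formula = sigma2 / (S22 * (1 - r23 ** 2))
cov_b2b3_formula = -r23 * sigma2 / ((1 - r23 ** 2) * np.sqrt(S22 * S33))

print(np.isclose(cov_b[1, 1], var_b2_formula))      # True
print(np.isclose(cov_b[1, 2], cov_b2b3_formula))    # True
```

Note that the sign of the covariance is opposite to the sign of $r_{23}$, exactly as the text observes.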

Exact, or perfect, collinearity occurs when the variation in an explanatory variable is zero, $\sum(x_{t2} - \bar{x}_2)^2 = 0$, or when the correlation between $x_{t2}$ and $x_{t3}$ is perfect, so that $r_{23} = \pm 1$. In these cases the least squares estimates are not unique, and, in the absence of additional information, best linear unbiased estimators are not available for all the regression parameters. Fortunately, this extreme case rarely occurs in practice.
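The non-uniqueness under exact collinearity is easy to exhibit. In the sketch below (all numbers invented for illustration), $x_{t3}$ is exactly twice $x_{t2}$, so $X$ has deficient rank and two different coefficient vectors yield identical fitted values:

```python
import numpy as np

# Sketch with made-up data: x3 = 2*x2 gives r23 = 1, so X'X is singular
# and the least squares solution is not unique.
T = 20
x2 = np.linspace(0.0, 1.0, T)
x3 = 2.0 * x2                                  # perfect collinearity
X = np.column_stack([np.ones(T), x2, x3])

print(np.linalg.matrix_rank(X))                # 2, not 3: X'X is singular

# Two distinct coefficient vectors that produce identical fitted values,
# since 3*x2 = 1.5*x3 here:
b_alt1 = np.array([1.0, 3.0, 0.0])
b_alt2 = np.array([1.0, 0.0, 1.5])
print(np.allclose(X @ b_alt1, X @ b_alt2))     # True
```

Because every coefficient vector on the line between `b_alt1` and `b_alt2` fits equally well, no data-based criterion can distinguish among them, which is why additional (non-sample) information is needed in this case.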

The commonly cited symptoms of collinearity, that least squares estimates have the wrong sign, are sensitive to slight changes in the data or the model specification, or are not statistically significant, follow from the large variances of the least squares estimators. The least squares estimators are unbiased under standard assumptions, so that $E[b_k] = \beta_k$, but how close an estimate might be to the true parameter value is determined by the estimator variance. Large variances for estimators imply that their sampling (probability) distributions are wide, meaning that in any particular sample the estimates we obtain may be far from the true parameter values.
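A small Monte Carlo experiment makes the point concrete. The setup below is an assumed illustration (the true coefficients, sample size, and number of replications are not from the text): the same model is estimated repeatedly under low and high collinearity, and the sampling spread of $b_2$ widens sharply as $r_{23}$ approaches one, even though the estimator remains unbiased in both cases:

```python
import numpy as np

# Monte Carlo sketch with assumed parameters: estimate beta2 repeatedly
# and compare the spread of the estimates under low vs. high collinearity.
rng = np.random.default_rng(1)
T, reps = 30, 2000
beta = np.array([1.0, 2.0, -1.0])              # assumed true parameters

def sd_of_b2(noise_scale):
    """Std. dev. of b2 across replications; smaller noise_scale -> higher r23."""
    x2 = rng.normal(size=T)
    x3 = x2 + noise_scale * rng.normal(size=T)
    X = np.column_stack([np.ones(T), x2, x3])
    draws = [np.linalg.lstsq(X, X @ beta + rng.normal(size=T), rcond=None)[0][1]
             for _ in range(reps)]
    return np.std(draws)

sd_low_collin = sd_of_b2(1.0)     # x3 only loosely related to x2
sd_high_collin = sd_of_b2(0.05)   # x3 nearly identical to x2
print(sd_low_collin < sd_high_collin)   # True: collinearity inflates the spread
```

With the wider sampling distribution, individual estimates can easily land far from $\beta_2 = 2$, or even carry the wrong sign, which is precisely the symptom described above.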



### Normality tests

Let us now consider the fundamental problem of testing disturbance normality in the context of the linear regression model: $Y = X\beta + u$, (23.12) where $Y = (y_1, \ldots,$ …

### Univariate Forecasts

Univariate forecasts are made solely using past observations on the series being forecast. Even if economic theory suggests additional variables that should be useful in forecasting a particular variable, univariate …

### Further Research on Cointegration

Although the discussion in the previous sections has been confined to the possibility of cointegration arising from linear combinations of I(1) variables, the literature is currently proceeding in several interesting …
