Inconsistency and bias of the OLS estimators
Given the setup, the probability limits of b and s2, for both the structural and the functional model, are
$$\kappa \equiv \operatorname{plim} b = \beta - \Sigma_x^{-1}\Omega\beta = \Sigma_x^{-1}\Sigma_\xi\beta. \tag{8.4}$$
$$\operatorname{plim} s^2 = \sigma_\varepsilon^2 + \beta'(\Sigma_\xi - \Sigma_\xi\Sigma_x^{-1}\Sigma_\xi)\beta \ge \sigma_\varepsilon^2. \tag{8.5}$$
Hence $b$ is inconsistent, with inconsistency equal to $\kappa - \beta = -\Sigma_x^{-1}\Omega\beta$, and $s^2$ is also inconsistent, the inconsistency being nonnegative since $\Sigma_\xi - \Sigma_\xi\Sigma_x^{-1}\Sigma_\xi = \Omega - \Omega\Sigma_x^{-1}\Omega \ge 0$.
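As a quick numerical check of these expressions (a sketch with arbitrarily chosen matrices $\Sigma_\xi$, $\Omega$, and vector $\beta$, not taken from the text), the two forms of $\kappa$ in (8.4) and the matrix identity behind (8.5) can be verified directly:

```python
import numpy as np

# Assumed positive definite covariance of the true regressors
Sigma_xi = np.array([[2.0, 0.5],
                     [0.5, 1.0]])
# Assumed measurement-error covariance
Omega = np.array([[0.4, 0.0],
                  [0.0, 0.3]])
Sigma_x = Sigma_xi + Omega          # covariance of the observed regressors
beta = np.array([1.0, -2.0])

Sx_inv = np.linalg.inv(Sigma_x)

# The two expressions for kappa = plim b in (8.4) coincide
kappa1 = beta - Sx_inv @ Omega @ beta
kappa2 = Sx_inv @ Sigma_xi @ beta
assert np.allclose(kappa1, kappa2)

# The matrix in (8.5) equals Omega - Omega Sigma_x^{-1} Omega ...
M1 = Sigma_xi - Sigma_xi @ Sx_inv @ Sigma_xi
M2 = Omega - Omega @ Sx_inv @ Omega
assert np.allclose(M1, M2)
# ... and is positive semidefinite, so plim s^2 >= sigma_eps^2
assert np.all(np.linalg.eigvalsh(M1) >= -1e-12)
print("identities verified")
```

The last assertion confirms that the matrix in (8.5) is positive semidefinite, so the probability limit of $s^2$ indeed exceeds $\sigma_\varepsilon^2$.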
Consider the case in which there is only one regressor ($g = 1$), measured with error. Then $\Sigma_x > \Sigma_\xi > 0$ are scalars and $\kappa/\beta = \Sigma_x^{-1}\Sigma_\xi\beta/\beta = \Sigma_\xi/\Sigma_x$ is a number between 0 and 1. So asymptotically the regression coefficient estimate is biased towards zero. This phenomenon is called attenuation. The size of the effect of the regressor on the dependent variable is underestimated.
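A small Monte Carlo experiment illustrates the attenuation; the parameter values and the simulation design below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
beta = 2.0
sigma2_xi, omega, sigma2_eps = 1.0, 0.5, 1.0   # assumed values

xi = rng.normal(0.0, np.sqrt(sigma2_xi), n)     # true regressor
x = xi + rng.normal(0.0, np.sqrt(omega), n)     # observed with error
y = beta * xi + rng.normal(0.0, np.sqrt(sigma2_eps), n)

b = (x @ y) / (x @ x)                           # OLS slope (no intercept)
kappa = sigma2_xi / (sigma2_xi + omega) * beta  # plim b from (8.4)

# b is close to kappa = 4/3, well below beta = 2
print(b, kappa)
```

With a large sample the OLS slope settles near $\kappa = \Sigma_\xi\beta/\Sigma_x$, not near $\beta$, which is exactly the attenuation described above.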
In the multiple regression case the characterization of the attenuation is slightly more complicated. The inequality $\Sigma_\xi - \Sigma_\xi\Sigma_x^{-1}\Sigma_\xi \ge 0$ and (8.4) together imply
$$\beta'\Sigma_\xi\beta \ge \kappa'\Sigma_x\kappa \tag{8.6}$$
or, using $\Sigma_x\kappa = \Sigma_\xi\beta$, $(\beta - \kappa)'\Sigma_x\kappa \ge 0$. This generalizes $\beta \ge \kappa > 0$ for the case $g = 1$ (assuming $\kappa > 0$). So, given $\beta$, $\kappa$ is located in the half-space that contains the origin and is bounded by the hyperplane $\{z : z'c = \beta'c\}$, where $c \equiv \Sigma_\xi\beta = \Sigma_x\kappa$; this hyperplane passes through $\beta$ and is perpendicular to $c$. It is, however, possible that $\kappa$ is farther from the origin than $\beta$.
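The last point can be illustrated numerically. In the sketch below (assumed values: two strongly correlated true regressors, only the first measured with error), $(\beta-\kappa)'\Sigma_x\kappa \ge 0$ holds, yet $\|\kappa\| > \|\beta\|$:

```python
import numpy as np

# Assumed example: strongly correlated true regressors,
# only the first one measured with error
Sigma_xi = np.array([[1.0, 0.9],
                     [0.9, 1.0]])
Omega = np.diag([0.5, 0.0])
Sigma_x = Sigma_xi + Omega
beta = np.array([1.0, 1.0])

kappa = np.linalg.solve(Sigma_x, Sigma_xi @ beta)   # plim b, eq. (8.4)
c = Sigma_xi @ beta                                  # = Sigma_x kappa

# kappa lies on the origin side of the hyperplane through beta ...
assert (beta - kappa) @ c >= 0
# ... yet it is farther from the origin than beta
print(np.linalg.norm(kappa), np.linalg.norm(beta))
```

So the half-space restriction does not prevent individual coefficients (here the second one) from being overestimated in magnitude.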
The term $\beta'\Sigma_\xi\beta$ in (8.6) is the variance of the systematic part of the regression. The term $\kappa'\Sigma_x\kappa$ is its probability limit when measurement error is neglected. Thus the variance of the systematic part is underestimated. This also has a direct bearing on the properties of $R^2 = b'S_x b/(y'y/N)$. This statistic converges to $\rho^2$, where
$$\rho^2 = \frac{\kappa'\Sigma_x\kappa}{\sigma_\varepsilon^2 + \beta'\Sigma_\xi\beta} \le \frac{\beta'\Sigma_\xi\beta}{\sigma_\varepsilon^2 + \beta'\Sigma_\xi\beta},$$
and the right-hand side is the "true" $R^2$. So the explanatory power of the model is underestimated.
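For a one-regressor example with assumed parameter values, $\rho^2$ and the true $R^2$ can be computed explicitly:

```python
import numpy as np

# Assumed scalar example: one error-ridden regressor
sigma2_xi, omega, sigma2_eps, beta = 1.0, 0.5, 1.0, 2.0
sigma2_x = sigma2_xi + omega
kappa = sigma2_xi / sigma2_x * beta          # attenuated plim of b

# plim of R^2 versus the true R^2
rho2 = kappa**2 * sigma2_x / (sigma2_eps + beta**2 * sigma2_xi)
true_R2 = beta**2 * sigma2_xi / (sigma2_eps + beta**2 * sigma2_xi)

print(rho2, true_R2)   # 8/15 versus 4/5
```

With these values $\rho^2 = 8/15 \approx 0.53$ while the true $R^2 = 0.8$, so the measurement error makes the model look substantially less informative than it is.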
When there is more than one regressor, but only one is measured with error, generally the estimators of all regression coefficients are biased. The coefficient of the error-ridden regressor (the first one, say) is biased towards zero, and the signs of the biases of the other coefficients can be estimated consistently. Let $e_i$ denote the $i$th unit vector, $\beta_1 = e_1'\beta$ (assumed positive), and $\omega = e_1'\Omega e_1 > 0$, so that $\Omega = \omega e_1 e_1'$. Then the bias of the $i$th element of the estimator of $\beta$ is
$$e_i'(\kappa - \beta) = -e_i'\Sigma_x^{-1}\Omega\beta = -\omega\beta_1 \cdot e_i'\Sigma_x^{-1}e_1.$$
Thus, the first regression coefficient is underestimated, whereas the signs of the biases in the other coefficients depend on the signs of the elements of the first column of $\Sigma_x^{-1}$. Even when $\Omega$ is unknown, these signs can be consistently estimated from the signs of the corresponding elements of $S_x^{-1}$.
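A sketch verifying the bias formula, using an assumed three-regressor covariance matrix in which only the first regressor is error-ridden:

```python
import numpy as np

# Assumed example: three regressors, only the first measured with error
Sigma_xi = np.array([[ 1.0, 0.5, -0.3],
                     [ 0.5, 1.0,  0.2],
                     [-0.3, 0.2,  1.0]])
omega = 0.4
e1 = np.array([1.0, 0.0, 0.0])
Omega = omega * np.outer(e1, e1)       # error only in the first regressor
Sigma_x = Sigma_xi + Omega
beta = np.array([2.0, 1.0, 1.0])       # beta_1 > 0

Sx_inv = np.linalg.inv(Sigma_x)
bias = -Sx_inv @ Omega @ beta          # kappa - beta

# closed form: -omega * beta_1 * (first column of Sigma_x^{-1})
closed = -omega * beta[0] * Sx_inv[:, 0]
assert np.allclose(bias, closed)

# first coefficient biased toward zero; other signs follow Sx_inv[:, 0]
assert bias[0] < 0
assert np.all(np.sign(bias[1:]) == -np.sign(Sx_inv[1:, 0]))
print(bias)
```

The sign pattern of `Sx_inv[:, 0]` is what the text says can be estimated consistently from the sample counterpart $S_x^{-1}$ even when $\Omega$ is unknown.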