Independence of Linear and Quadratic Transformations of Multivariate Normal Random Variables
Let X be distributed N_n(0, I_n), that is, X is n-variate, standard, normally distributed. Consider the linear transformations Y = BX, where B is a k × n matrix of constants, and Z = CX, where C is an m × n matrix of constants. It follows from Theorem 5.4 that

\begin{pmatrix} Y \\ Z \end{pmatrix} \sim N_{k+m}\left( \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \begin{pmatrix} BB^T & BC^T \\ CB^T & CC^T \end{pmatrix} \right).
Then Y and Z are uncorrelated and therefore independent if and only if CB^T = O. More generally, we have:
Theorem 5.6: Let X be distributed N_n(0, I_n), and consider the linear transformations Y = b + BX, where b is a k × 1 vector and B a k × n matrix of constants, and Z = c + CX, where c is an m × 1 vector and C an m × n matrix of constants. Then Y and Z are independent if and only if BC^T = O.
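As a quick numerical illustration of Theorem 5.6 (not part of the original text), the following Python sketch picks particular matrices B and C with BC^T = O and checks by simulation that the sample cross-covariance between Y and Z is close to the zero matrix. The specific matrices, shift vectors, random seed, and number of draws are arbitrary choices made only for this example.

```python
# A minimal sketch of Theorem 5.6: with B C^T = O, the jointly normal vectors
# Y = b + BX and Z = c + CX should be independent, so their sample
# cross-covariance should be near the zero matrix.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 4, 200_000

B = np.array([[1., 1., 0., 0.],
              [0., 0., 1., -1.]])        # k x n
C = np.array([[1., -1., 0., 0.],
              [0., 0., 1., 1.]])         # m x n, rows orthogonal to rows of B
b = np.array([1., 2.])                   # arbitrary shift vectors
c = np.array([-1., 3.])

print("B C^T =\n", B @ C.T)              # exactly the zero matrix

X = rng.standard_normal((reps, n))       # rows are draws of X ~ N_n(0, I_n)
Y = b + X @ B.T                          # reps x k
Z = c + X @ C.T                          # reps x m

# Sample cross-covariance between Y and Z; should be close to O.
cross_cov = (Y - Y.mean(0)).T @ (Z - Z.mean(0)) / (reps - 1)
print("sample Cov(Y, Z) =\n", cross_cov.round(3))
```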
This result can be used to set forth conditions for independence of linear and quadratic transformations of standard normal random vectors:
Theorem 5.7: Let X and Y be defined as in Theorem 5.6, and let Z = X^T C X, where C is a symmetric n × n matrix of constants. Then Y and Z are independent if BC = O.
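Theorem 5.7 can likewise be illustrated by simulation. The sketch below (an illustration only, with an arbitrarily chosen B and singular symmetric C satisfying BC = O) checks that the sample correlation between Z and functions of Y is close to zero, as independence requires.

```python
# A minimal sketch of Theorem 5.7: with B C = O, the linear form Y = b + BX and
# the quadratic form Z = X^T C X should be independent; as a rough check, the
# sample correlations between Z and (nonlinear) functions of Y should be near 0.
import numpy as np

rng = np.random.default_rng(1)
n, reps = 3, 200_000

B = np.array([[1., 0., 0.]])             # 1 x n
C = np.diag([0., 1., 2.])                # symmetric, singular, and B C = O
b = np.array([0.5])

print("B C =\n", B @ C)                  # exactly the zero matrix

X = rng.standard_normal((reps, n))
Y = (b + X @ B.T).ravel()                # scalar linear form
Z = np.einsum("ij,jk,ik->i", X, C, X)    # Z_i = x_i^T C x_i

print("corr(Y, Z)   =", round(np.corrcoef(Y, Z)[0, 1], 4))
print("corr(Y^2, Z) =", round(np.corrcoef(Y**2, Z)[0, 1], 4))
```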
Proof: First, note that the latter condition only makes sense if C is singular, for otherwise B = O. Thus, let rank(C) = m < n. We can write C = QΛQ^T, where Λ is a diagonal matrix with the eigenvalues of C on the diagonal, and Q is the orthogonal matrix of corresponding eigenvectors. Let V = Q^T X, which is N_n(0, I_n) distributed because QQ^T = I_n. Because n - m eigenvalues of C are zero, we can partition Q, Λ, and V such that
Q = (Q1, Q2),   Λ = \begin{pmatrix} Λ1 & O \\ O & O \end{pmatrix},   V = \begin{pmatrix} V1 \\ V2 \end{pmatrix} = \begin{pmatrix} Q1^T X \\ Q2^T X \end{pmatrix},

and hence

Z = X^T C X = V^T Λ V = V1^T Λ1 V1,
where Λ1 is the diagonal matrix with the m nonzero eigenvalues of C on the diagonal. Then
BC = B(Q1, Q2) \begin{pmatrix} Λ1 & O \\ O & O \end{pmatrix} \begin{pmatrix} Q1^T \\ Q2^T \end{pmatrix} = B Q1 Λ1 Q1^T = O
implies B Q1 Λ1 = B Q1 Λ1 Q1^T Q1 = O (because Q^T Q = I_n implies Q1^T Q1 = I_m), which in turn implies that B Q1 = O because Λ1 is nonsingular. By Theorem 5.6, the latter is a sufficient condition for the independence of V1 and Y, and hence for the independence of Z and Y. Q.E.D.
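The key algebraic step of this proof can also be checked numerically. The sketch below (an illustration only, with an arbitrarily chosen rank-2 matrix C and a conforming B satisfying BC = O) eigendecomposes C, collects the eigenvectors belonging to the nonzero eigenvalues in Q1, and verifies that BQ1 = O and that X^T C X equals V1^T Λ1 V1.

```python
# A small numerical check of the algebra in the proof of Theorem 5.7:
# eigendecompose C = Q Λ Q^T, keep the columns Q1 for the nonzero eigenvalues,
# and verify that BC = O forces B Q1 = O and that x^T C x = v1^T Λ1 v1.
import numpy as np

rng = np.random.default_rng(2)
n = 4
B = np.array([[1., 0., 0., 0.],
              [0., 1., 0., 0.]])
C = np.zeros((n, n))
C[2:, 2:] = np.array([[2., 1.],
                      [1., 3.]])         # symmetric, rank 2, and B C = O

eigvals, Q = np.linalg.eigh(C)           # C = Q diag(eigvals) Q^T
nonzero = np.abs(eigvals) > 1e-10
Q1, lam1 = Q[:, nonzero], eigvals[nonzero]

print("B C  =\n", B @ C)
print("B Q1 =\n", (B @ Q1).round(10))    # zero, as the proof requires

x = rng.standard_normal(n)
v1 = Q1.T @ x
print("x^T C x    =", x @ C @ x)
print("v1^T Λ1 v1 =", v1 @ np.diag(lam1) @ v1)   # identical value
```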
Finally, consider the conditions for independence of two quadratic forms of standard normal random vectors:
Theorem 5.8: Let X ~ N_n(0, I_n), Z1 = X^T A X, and Z2 = X^T B X, where A and B are symmetric n × n matrices of constants. Then Z1 and Z2 are independent if and only if AB = O.
The proof of Theorem 5.8 is not difficult but is quite lengthy; it is therefore given in Appendix 5.A.
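As with the previous theorems, Theorem 5.8 can be illustrated by a short simulation. In the sketch below (an illustration only), A and B are chosen as complementary diagonal projection matrices, so that AB = O and Z1 and Z2 are independent chi-square(2) variables; their sample correlation should therefore be near zero. The matrix B of the theorem is named Bq in the code to avoid clashing with the B used in the earlier sketches.

```python
# A minimal sketch of Theorem 5.8: with A B = O, the quadratic forms
# Z1 = X^T A X and Z2 = X^T B X should be independent. Here A and Bq are
# complementary projections, so Z1 and Z2 are independent chi-square(2) draws.
import numpy as np

rng = np.random.default_rng(3)
n, reps = 4, 200_000

A = np.diag([1., 1., 0., 0.])
Bq = np.diag([0., 0., 1., 1.])           # symmetric, with A Bq = O
print("A Bq =\n", A @ Bq)

X = rng.standard_normal((reps, n))
Z1 = np.einsum("ij,jk,ik->i", X, A, X)
Z2 = np.einsum("ij,jk,ik->i", X, Bq, X)

print("corr(Z1, Z2) =", round(np.corrcoef(Z1, Z2)[0, 1], 4))
print("means:", Z1.mean().round(3), Z2.mean().round(3))   # both near 2 (= rank)
```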