Introduction to the Mathematical and Statistical Foundations of Econometrics

Independence of Linear and Quadratic Transformations of Multivariate Normal Random Variables

Let X be distributed N_n(0, I_n); that is, X is n-variate standard normally distributed. Consider the linear transformations Y = BX, where B is a k × n matrix of constants, and Z = CX, where C is an m × n matrix of constants. It follows from Theorem 5.4 that

$$\begin{pmatrix} Y \\ Z \end{pmatrix} \sim N_{k+m}\!\left(\begin{pmatrix} 0 \\ 0 \end{pmatrix}, \begin{pmatrix} BB^{\mathrm{T}} & BC^{\mathrm{T}} \\ CB^{\mathrm{T}} & CC^{\mathrm{T}} \end{pmatrix}\right).$$

Then Y and Z are uncorrelated and therefore independent if and only if CB^T = O. More generally we have

Theorem 5.6: Let X be distributed N_n(0, I_n), and consider the linear transformations Y = b + BX, where b is a k × 1 vector and B a k × n matrix of constants, and Z = c + CX, where c is an m × 1 vector and C an m × n matrix of constants. Then Y and Z are independent if and only if BC^T = O.
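
As a quick numerical illustration of Theorem 5.6, the following Python sketch (the matrices, sizes, and seed are chosen purely for illustration and are not from the text) builds B and C with BC^T = O and checks by simulation that the sample cross-covariance of Y and Z is approximately the zero matrix, in line with Cov(Y, Z) = BC^T. A zero sample covariance only suggests, rather than proves, independence, but it shows the condition at work.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 4, 2, 2

# B and C have orthogonal row spaces, so B @ C.T is exactly the zero matrix.
B = np.array([[1., 0., 0., 0.],
              [0., 1., 0., 0.]])          # k x n
C = np.array([[0., 0., 1., 0.],
              [0., 0., 0., 1.]])          # m x n
b = np.ones(k)
c = np.ones(m)

print("B C^T =\n", B @ C.T)               # condition of Theorem 5.6

# Monte Carlo check: the sample cross-covariance of Y and Z should be near O.
X = rng.standard_normal((100_000, n))     # each row is a draw of X ~ N_n(0, I_n)
Y = b + X @ B.T                           # rows are draws of Y = b + BX
Z = c + X @ C.T                           # rows are draws of Z = c + CX
cross_cov = (Y - Y.mean(0)).T @ (Z - Z.mean(0)) / (len(X) - 1)
print("sample Cov(Y, Z) =\n", cross_cov.round(3))
```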

This result can be used to set forth conditions for independence of linear and quadratic transformations of standard normal random vectors:

Theorem 5.7: Let X and Y be defined as in Theorem 5.6, and let Z = X^T CX, where C is a symmetric n × n matrix of constants. Then Y and Z are independent if BC = O.
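
A small simulation sketch of Theorem 5.7 follows; the particular B and C below are hypothetical choices, not taken from the text. Here C is symmetric and singular, B satisfies BC = O, and the simulated correlation between Y = BX and Z = X^T CX comes out close to zero, as the independence claimed by the theorem would require.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

C = np.diag([1., 2., 0.])                 # symmetric, rank 2 (singular, as required)
B = np.array([[0., 0., 1.]])              # 1 x n, chosen so that B @ C = O
print("B C =", B @ C)

X = rng.standard_normal((200_000, n))
Y = (X @ B.T).ravel()                     # draws of Y = BX (a scalar here)
Z = np.einsum('ij,jk,ik->i', X, C, X)     # draws of Z = X^T C X
print("sample corr(Y, Z) =", round(np.corrcoef(Y, Z)[0, 1], 3))
```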

Proof: First, note that the latter condition only makes sense if C is singular, for otherwise B = O. Thus, let rank(C) = m < n. We can write C = QΛQ^T, where Λ is a diagonal matrix with the eigenvalues of C on the diagonal and Q is the orthogonal matrix of corresponding eigenvectors. Let V = Q^T X, which is N_n(0, I_n) distributed because QQ^T = I_n. Because n − m eigenvalues of C are zero, we can partition Q, Λ, and V such that

$$Q = (Q_1, Q_2), \quad \Lambda = \begin{pmatrix} \Lambda_1 & O \\ O & O \end{pmatrix}, \quad V = \begin{pmatrix} V_1 \\ V_2 \end{pmatrix} = \begin{pmatrix} Q_1^{\mathrm{T}} X \\ Q_2^{\mathrm{T}} X \end{pmatrix}, \quad Z = V_1^{\mathrm{T}} \Lambda_1 V_1,$$

where Λ_1 is the diagonal matrix with the m nonzero eigenvalues of C on the diagonal. Then

$$BC = B(Q_1, Q_2) \begin{pmatrix} \Lambda_1 & O \\ O & O \end{pmatrix} \begin{pmatrix} Q_1^{\mathrm{T}} \\ Q_2^{\mathrm{T}} \end{pmatrix} = BQ_1\Lambda_1 Q_1^{\mathrm{T}} = O$$

implies BQ_1Λ_1 = BQ_1Λ_1Q_1^T Q_1 = O (because Q^T Q = I_n implies Q_1^T Q_1 = I_m), which in turn implies that BQ_1 = O. By Theorem 5.6, the latter is a sufficient condition for the independence of V_1 = Q_1^T X and Y, and hence, because Z is a function of V_1 alone, for the independence of Z and Y. Q.E.D.
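
The key reduction in the proof, writing C = QΛQ^T and collapsing Z = X^T CX to V_1^T Λ_1 V_1, can be checked numerically. The sketch below uses an arbitrary symmetric rank-2 matrix (a hypothetical example, not from the text) and numpy's eigendecomposition to confirm that both expressions give the same value.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

G = rng.standard_normal((n, 2))
C = G @ G.T                               # a symmetric rank-2 matrix

eigval, Q = np.linalg.eigh(C)             # C = Q diag(eigval) Q^T, Q orthogonal
keep = np.abs(eigval) > 1e-10             # pick out the m nonzero eigenvalues
Q1, lam1 = Q[:, keep], eigval[keep]       # Q1 is n x m; lam1 holds the diagonal of Λ1

x = rng.standard_normal(n)                # one draw of X
v1 = Q1.T @ x                             # V1 = Q1^T X
print(x @ C @ x)                          # Z computed directly as X^T C X ...
print(v1 @ np.diag(lam1) @ v1)            # ... equals V1^T Λ1 V1
```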

Finally, consider the conditions for independence of two quadratic forms of standard normal random vectors:

Theorem 5.8: Let X ~ N_n(0, I_n), Z_1 = X^T AX, and Z_2 = X^T BX, where A and B are symmetric n × n matrices of constants. Then Z_1 and Z_2 are independent if and only if AB = O.

The proof of Theorem 5.8 is not difficult but is quite lengthy; it is therefore given in Appendix 5.A.
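
As with the previous results, Theorem 5.8 is easy to illustrate by simulation; the diagonal A and B below are a hypothetical choice satisfying AB = O, and the check only verifies that the two quadratic forms are uncorrelated in a large sample, which is consistent with (though weaker than) the independence the theorem asserts.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

A = np.diag([1., 1., 0., 0.])             # symmetric
B = np.diag([0., 0., 2., 3.])             # symmetric, and A @ B = O
print("A B =\n", A @ B)

X = rng.standard_normal((200_000, n))
Z1 = np.einsum('ij,jk,ik->i', X, A, X)    # draws of Z1 = X^T A X
Z2 = np.einsum('ij,jk,ik->i', X, B, X)    # draws of Z2 = X^T B X
print("sample corr(Z1, Z2) =", round(np.corrcoef(Z1, Z2)[0, 1], 3))
```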


