Introduction to the Mathematical and Statistical Foundations of Econometrics

Eigenvalues and Eigenvectors of Symmetric Matrices

On the basis of (I.60) it is easy to show that, in the case of a symmetric matrix A, β = 0 and b = 0:

Theorem I.34: The eigenvalues of a symmetric n x n matrix A are all real valued, and the corresponding eigenvectors are contained in ℝⁿ.

Proof: First, note that (I.60) implies that, for arbitrary ξ ∈ ℝ,

$$
0 = \begin{pmatrix} b^{\mathrm{T}} & \xi a^{\mathrm{T}} \end{pmatrix}
\begin{pmatrix} A - \alpha I_n & \beta I_n \\ -\beta I_n & A - \alpha I_n \end{pmatrix}
\begin{pmatrix} a \\ b \end{pmatrix}
= \xi a^{\mathrm{T}} A b + b^{\mathrm{T}} A a - \alpha b^{\mathrm{T}} a - \xi \alpha a^{\mathrm{T}} b + \beta b^{\mathrm{T}} b - \beta \xi a^{\mathrm{T}} a.
$$

Next observe that bᵀa = aᵀb and, by symmetry, bᵀAa = (bᵀAa)ᵀ = aᵀAᵀb = aᵀAb, where the first equality follows because bᵀAa is a scalar (or 1 x 1 matrix). Then we have, for arbitrary ξ ∈ ℝ,

(ξ + 1)aᵀAb − α(ξ + 1)aᵀb + β(bᵀb − ξaᵀa) = 0. (I.61)

If we choose ξ = −1 in (I.61), then β(bᵀb + aᵀa) = β·‖x‖² = 0; consequently, β = 0 and thus λ = α ∈ ℝ. It is now easy to see that b no longer matters, and we may therefore choose b = 0. Q.E.D.
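This result is easy to check numerically. The sketch below is only an illustration (NumPy and the randomly generated symmetric matrix are assumptions of the illustration, not part of the text): a general-purpose eigenvalue routine, which does not exploit symmetry, still returns eigenvalues whose imaginary parts are zero up to rounding.

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary real symmetric matrix: A = (B + B^T) / 2.
B = rng.normal(size=(5, 5))
A = (B + B.T) / 2

# General-purpose eigenvalue routine; it does not assume symmetry.
eigenvalues = np.linalg.eigvals(A)

# Theorem I.34: the eigenvalues are real, so any imaginary parts are rounding noise.
print(np.max(np.abs(np.imag(eigenvalues))))   # 0.0, or on the order of 1e-16
```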

There is more to say about the eigenvectors of symmetric matrices, namely,

Footnote 14: Recall (see Appendix III) that the length (or norm) of a complex number x = a + i·b, a, b ∈ ℝ, is defined as |x| = √((a + i·b)·(a − i·b)) = √(a² + b²). Similarly, in the vector case x = a + i·b, a, b ∈ ℝⁿ, the length of x is defined as ‖x‖ = √((a + i·b)ᵀ(a − i·b)) = √(aᵀa + bᵀb).

Theorem I.35: The eigenvectors of a symmetric n x n matrix A can be chosen orthonormal.

Proof: First assume that all the eigenvalues λ₁, λ₂, ..., λₙ of A are different. Let x₁, x₂, ..., xₙ be the corresponding eigenvectors. Then for i ≠ j, xᵢᵀAxⱼ = λⱼxᵢᵀxⱼ and xⱼᵀAxᵢ = λᵢxⱼᵀxᵢ; hence, (λᵢ − λⱼ)xᵢᵀxⱼ = 0 because, by symmetry,

xᵢᵀAxⱼ = (xᵢᵀAxⱼ)ᵀ = xⱼᵀAᵀxᵢ = xⱼᵀAxᵢ.

Because λᵢ ≠ λⱼ, it follows now that xᵢᵀxⱼ = 0. Upon normalizing the eigenvectors as qⱼ = ‖xⱼ‖⁻¹xⱼ, we obtain the result.
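Before turning to the case of equal eigenvalues, here is a hedged numerical illustration of the distinct-eigenvalue case (NumPy and the particular 3 x 3 matrix are ad hoc choices, not part of the proof): eigenvectors belonging to different eigenvalues of a symmetric matrix come out orthogonal even when the eigensolver does not enforce this.

```python
import numpy as np

# A symmetric matrix with three distinct eigenvalues (an ad hoc choice).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

# np.linalg.eig does not impose orthogonality on the eigenvectors it returns;
# for a symmetric A with distinct eigenvalues they are orthogonal anyway.
eigenvalues, X = np.linalg.eig(A)
print(eigenvalues)                          # three distinct real values
print(np.allclose(X.T @ X, np.eye(3)))      # True: x_i^T x_j = 0 for i != j, unit norms
```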

The case in which two or more eigenvalues are equal requires a completely different proof. First, normalize the eigenvectors as qⱼ = ‖xⱼ‖⁻¹xⱼ. Using the approach in Section I.10, we can always construct vectors y₂, ..., yₙ ∈ ℝⁿ such that q₁, y₂, ..., yₙ is an orthonormal basis of ℝⁿ. Then Q₁ = (q₁, y₂, ..., yₙ) is an orthogonal matrix. The first column of Q₁ᵀAQ₁ is Q₁ᵀAq₁ = λ₁Q₁ᵀq₁. But by the orthogonality of Q₁, q₁ᵀQ₁ = q₁ᵀ(q₁, y₂, ..., yₙ) = (q₁ᵀq₁, q₁ᵀy₂, ..., q₁ᵀyₙ) = (1, 0, 0, ..., 0); hence, the first column of Q₁ᵀAQ₁ is (λ₁, 0, 0, ..., 0)ᵀ and, by symmetry of Q₁ᵀAQ₁, the first row is (λ₁, 0, 0, ..., 0). Thus, Q₁ᵀAQ₁ takes the form

$$
Q_1^{\mathrm{T}} A Q_1 = \begin{pmatrix} \lambda_1 & 0^{\mathrm{T}} \\ 0 & A_{n-1} \end{pmatrix}.
$$

Next, observe that

det(Q₁ᵀAQ₁ − λIₙ) = det(Q₁ᵀAQ₁ − λQ₁ᵀQ₁)
= det[Q₁ᵀ(A − λIₙ)Q₁]
= det(Q₁ᵀ) det(A − λIₙ) det(Q₁) = det(A − λIₙ),

and thus the eigenvalues of Q₁ᵀAQ₁ are the same as the eigenvalues of A; consequently, the eigenvalues of Aₙ₋₁ are λ₂, ..., λₙ. Applying the preceding argument to Aₙ₋₁, we obtain an orthogonal (n − 1) x (n − 1) matrix Q₂* such that

$$
Q_2^{*\mathrm{T}} A_{n-1} Q_2^{*} = \begin{pmatrix} \lambda_2 & 0^{\mathrm{T}} \\ 0 & A_{n-2} \end{pmatrix}.
$$

Hence, letting

$$
Q_2 = \begin{pmatrix} 1 & 0^{\mathrm{T}} \\ 0 & Q_2^{*} \end{pmatrix},
$$

which is an orthogonal n x n matrix, we can write

$$
Q_2^{\mathrm{T}} Q_1^{\mathrm{T}} A Q_1 Q_2 = \begin{pmatrix} \Lambda_2 & O \\ O & A_{n-2} \end{pmatrix},
$$
where Λ₂ is a diagonal matrix with diagonal elements λ₁ and λ₂. Repeating this procedure n − 3 more times yields

Qₙᵀ ⋯ Q₂ᵀQ₁ᵀAQ₁Q₂ ⋯ Qₙ = Λ,

where Λ is the diagonal matrix with diagonal elements λ₁, λ₂, ..., λₙ.

Note that Q = Q₁Q₂ ⋯ Qₙ is an orthogonal matrix itself, and it is now easy to verify that the columns of Q are the eigenvectors of A. Q.E.D.
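The deflation step at the heart of this proof can be mimicked numerically. The following sketch is an illustration only, under assumed choices (NumPy, a random 4 x 4 symmetric matrix, and a QR factorization standing in for the Section I.10 basis construction): it extends a unit eigenvector q₁ of A to an orthogonal matrix Q₁ and checks that Q₁ᵀAQ₁ has the claimed block form.

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.normal(size=(4, 4))
A = (B + B.T) / 2                        # arbitrary symmetric test matrix

# Take one unit eigenvector q1 of A and its eigenvalue lambda_1.
lam, vecs = np.linalg.eigh(A)
lam1, q1 = lam[0], vecs[:, 0]

# Extend q1 to an orthonormal basis (q1, y2, ..., yn): QR of [q1 | random columns]
# stands in for the Section I.10 construction; the first column of Q1 is +/- q1.
M = np.column_stack([q1, rng.normal(size=(4, 3))])
Q1, _ = np.linalg.qr(M)

# Q1^T A Q1 has the block form (lambda_1, 0^T; 0, A_{n-1}).
T = Q1.T @ A @ Q1
print(np.round(T, 10))                   # zeros in the first row and column off the corner
print(np.isclose(T[0, 0], lam1))         # True
```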

In view of this proof, we can now restate Theorem I.35 as follows:

Theorem I.36: A symmetric matrix A can be written as A = QΛQᵀ, where Λ is a diagonal matrix with the eigenvalues of A on the diagonal and Q is the orthogonal matrix with the corresponding eigenvectors as columns.
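A minimal numerical check of Theorem I.36 (NumPy and the random test matrix are assumptions of the illustration): np.linalg.eigh returns the eigenvalues together with an orthogonal matrix of eigenvectors, and the product QΛQᵀ reproduces A.

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.normal(size=(5, 5))
A = (B + B.T) / 2                      # arbitrary symmetric matrix

eigenvalues, Q = np.linalg.eigh(A)     # columns of Q: orthonormal eigenvectors
Lambda = np.diag(eigenvalues)

print(np.allclose(Q.T @ Q, np.eye(5)))     # Q is orthogonal: True
print(np.allclose(A, Q @ Lambda @ Q.T))    # A = Q Lambda Q^T: True
```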

This theorem yields several useful corollaries. The first one is trivial:

Theorem I.37: The determinant of a symmetric matrix is the product of its eigenvalues.
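The reasoning is that det(A) = det(Q) det(Λ) det(Qᵀ) = det(Λ) because det(Q) det(Qᵀ) = det(QQᵀ) = 1. A hedged numerical check (arbitrary test matrix, NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.normal(size=(4, 4))
A = (B + B.T) / 2                      # arbitrary symmetric matrix

eigenvalues = np.linalg.eigvalsh(A)

# det(A) = det(Q) det(Lambda) det(Q^T) = det(Lambda) = product of the eigenvalues.
print(np.isclose(np.linalg.det(A), np.prod(eigenvalues)))   # True
```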

The next corollary concerns idempotent matrices (see Definition I.12):

Theorem I.38: The eigenvalues of a symmetric idempotent matrix are either 0 or 1. Consequently, the only nonsingular symmetric idempotent matrix is the unit matrix I.

Proof: Let the matrix A in Theorem I.36 be idempotent: A·A = A. Then A = QΛQᵀ = A·A = QΛQᵀQΛQᵀ = QΛ²Qᵀ; hence, Λ = Λ². Because Λ is diagonal, each diagonal element λⱼ satisfies λⱼ = λⱼ²; hence, λⱼ(1 − λⱼ) = 0. Moreover, if A is nonsingular and idempotent, then none of the eigenvalues can be zero; hence, they are all equal to 1: Λ = I. Then A = QIQᵀ = QQᵀ = I. Q.E.D.
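A familiar example of a symmetric idempotent matrix is a projection matrix of the form X(XᵀX)⁻¹Xᵀ. The sketch below is an illustration only (NumPy is assumed, and the full-column-rank matrix X is an arbitrary choice): it verifies symmetry and idempotency and shows that the eigenvalues are zeros and ones.

```python
import numpy as np

rng = np.random.default_rng(4)

# Projection matrix P = X (X^T X)^{-1} X^T for an arbitrary full-column-rank X (n = 6, k = 2).
X = rng.normal(size=(6, 2))
P = X @ np.linalg.inv(X.T @ X) @ X.T

print(np.allclose(P, P.T))                   # symmetric: True
print(np.allclose(P @ P, P))                 # idempotent: True
print(np.round(np.linalg.eigvalsh(P), 10))   # four zeros and two ones
```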
