Introduction to the Mathematical and Statistical Foundations of Econometrics

Expectations of Products of Independent Random Variables

Let X and Y be independent random variables, and let f and g be Borel-measurable functions on R. I will now show that

E[f(X)g(Y)] = (E[f(X)])(E[g(Y)]).    (2.30)

If X and Y are dependent, (2.30) does not hold in general, although there are dependent X and Y for which it does hold. As an example of a case in which (2.30) does not hold, let X = U0·U1 and Y = U0·U2, where U0, U1, and U2 are independent and uniformly [0, 1] distributed, and let f(x) = x, g(x) = x. The joint density of U0, U1, and U2 is

h(u0, u1, u2) = 1 if (u0, u1, u2)ᵀ ∈ [0, 1] × [0, 1] × [0, 1],
h(u0, u1, u2) = 0 elsewhere;

hence,

E[f(X)g(Y)] = E[X·Y] = E[U0²·U1·U2]
  = ∫₀¹ ∫₀¹ ∫₀¹ u0²·u1·u2 du0 du1 du2
  = (∫₀¹ u0² du0) × (∫₀¹ u1 du1) × (∫₀¹ u2 du2)
  = (1/3) × (1/2) × (1/2) = 1/12,

whereas

E[f(X)] = E[X] = E[U0·U1] = ∫₀¹ ∫₀¹ u0·u1 du0 du1
  = (∫₀¹ u0 du0) × (∫₀¹ u1 du1) = (1/2) × (1/2) = 1/4,

and similarly, E[g(Y)] = E[Y] = 1/4. Thus, E[f(X)g(Y)] = 1/12 ≠ 1/16 = (E[f(X)])(E[g(Y)]), so (2.30) fails for this dependent pair X and Y.
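These values are easy to check numerically. The following is a minimal Monte Carlo sketch, not part of the original text; it assumes Python with numpy, and the sample size and seed are arbitrary choices. It compares the sample analogues of E[f(X)g(Y)] and (E[f(X)])(E[g(Y)]):

```python
import numpy as np

# Monte Carlo check of E[f(X)g(Y)] = 1/12 versus E[f(X)]E[g(Y)] = 1/16
# for X = U0*U1, Y = U0*U2 with U0, U1, U2 independent uniform on [0, 1].
rng = np.random.default_rng(0)   # seed chosen arbitrarily
n = 1_000_000                    # sample size chosen arbitrarily
u0, u1, u2 = rng.uniform(0.0, 1.0, size=(3, n))

x = u0 * u1
y = u0 * u2

print("E[X*Y]    ≈", (x * y).mean())        # should be close to 1/12 ≈ 0.0833
print("E[X]*E[Y] ≈", x.mean() * y.mean())   # should be close to 1/16 = 0.0625
```

With a large sample the two printed estimates should settle near 1/12 ≈ 0.083 and 1/16 = 0.0625, respectively, confirming that the two sides of (2.30) differ here.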

As an example of dependent random variables X and Y for which (2.30) does hold, now let X = U0·(U1 − 0.5) and Y = U0·(U2 − 0.5), where U0, U1, and U2 are the same as before, and again f(x) = x, g(x) = x. Then it is easy to show that E[X·Y] = E[X] = E[Y] = 0, because E[U1 − 0.5] = E[U2 − 0.5] = 0 and U0, U1, and U2 are independent; hence, both sides of (2.30) are zero.
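The same kind of numerical check applies here; the sketch below is again only an illustration assuming numpy, not part of the text. It confirms that all three sample means are close to zero, so both sides of (2.30) vanish:

```python
import numpy as np

# Same uniforms as before, but now X = U0*(U1 - 0.5) and Y = U0*(U2 - 0.5).
# All three expectations below are zero, so (2.30) holds even though
# X and Y share the common factor U0.
rng = np.random.default_rng(0)
n = 1_000_000
u0, u1, u2 = rng.uniform(0.0, 1.0, size=(3, n))

x = u0 * (u1 - 0.5)
y = u0 * (u2 - 0.5)

print("E[X*Y] ≈", (x * y).mean())  # close to 0
print("E[X]   ≈", x.mean())        # close to 0
print("E[Y]   ≈", y.mean())        # close to 0
```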

To prove (2.30) for independent random variables X and Y, let f and g be simple functions, f(x) = Σ_{i=1}^{m} αi·I(x ∈ Ai), g(x) = Σ_{j=1}^{n} βj·I(x ∈ Bj), where the Ai's are disjoint Borel sets and the Bj's are disjoint Borel sets. Then

E[f(X)g(Y)] = E[Σ_{i=1}^{m} Σ_{j=1}^{n} αi·βj·I(X ∈ Ai and Y ∈ Bj)]
  = Σ_{i=1}^{m} Σ_{j=1}^{n} αi·βj·P({ω ∈ Ω : X(ω) ∈ Ai and Y(ω) ∈ Bj})
  = Σ_{i=1}^{m} Σ_{j=1}^{n} αi·βj·P({ω ∈ Ω : X(ω) ∈ Ai})·P({ω ∈ Ω : Y(ω) ∈ Bj})
  = (Σ_{i=1}^{m} αi·P({ω ∈ Ω : X(ω) ∈ Ai})) × (Σ_{j=1}^{n} βj·P({ω ∈ Ω : Y(ω) ∈ Bj}))
  = (E[f(X)])(E[g(Y)])

because, by the independence of X and Y, P(X ∈ Ai and Y ∈ Bj) = P(X ∈ Ai)·P(Y ∈ Bj). From this result for simple functions, the next theorem follows more generally:
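To see the simple-function identity at work numerically, the following sketch compares the sample analogues of both sides of (2.30) for independent X and Y. It is an illustration only: the intervals Ai and Bj, the coefficients αi and βj, and the uniform distributions chosen for X and Y are arbitrary assumptions, and numpy is assumed to be available.

```python
import numpy as np

# Numerical illustration of (2.30) for simple functions of independent X and Y.
rng = np.random.default_rng(1)
n = 1_000_000
x = rng.uniform(0.0, 1.0, size=n)   # X and Y independent standard uniforms
y = rng.uniform(0.0, 1.0, size=n)

def f(v):
    # f(v) = 2*I(v in [0, 0.3)) - 1*I(v in [0.3, 1)); disjoint Borel sets A1, A2
    return 2.0 * ((v >= 0.0) & (v < 0.3)) - 1.0 * ((v >= 0.3) & (v < 1.0))

def g(v):
    # g(v) = 0.5*I(v in [0, 0.6)) + 3*I(v in [0.6, 1)); disjoint Borel sets B1, B2
    return 0.5 * ((v >= 0.0) & (v < 0.6)) + 3.0 * ((v >= 0.6) & (v < 1.0))

# Because X and Y are independent, the two estimates agree up to simulation error.
print("E[f(X)g(Y)]     ≈", (f(x) * g(y)).mean())
print("E[f(X)]*E[g(Y)] ≈", f(x).mean() * g(y).mean())
```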

 

Theorem 2.20: Let X and Y be random vectors in R^p and R^q, respectively. Then X and Y are independent if and only if E[f(X)g(Y)] = (E[f(X)])(E[g(Y)]) for all Borel-measurable functions f and g on R^p and R^q, respectively, for which the expectations involved are defined.

This theorem implies that independent random variables are uncorrelated. The reverse, however, is in general not true. A counterexample is the case I have considered before, namely, X = U0·(U1 − 0.5) and Y = U0·(U2 − 0.5), where U0, U1, and U2 are independent and uniformly [0, 1] distributed. In this case, E[X·Y] = E[X] = E[Y] = 0; hence, cov(X, Y) = 0, but X and Y are dependent owing to the common factor U0. The latter can be shown formally in different ways, but the easiest way is to verify that, for example, E[X²·Y²] ≠ (E[X²])(E[Y²]): by the independence of U0, U1, and U2, E[X²·Y²] = E[U0⁴]·E[(U1 − 0.5)²]·E[(U2 − 0.5)²] = (1/5) × (1/12) × (1/12) = 1/720, whereas (E[X²])(E[Y²]) = ((1/3) × (1/12))² = 1/1296. The dependence of X and Y thus follows from Theorem 2.20.
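A quick numerical check of these two moments under the same setup; this is an illustrative sketch assuming numpy, with seed and sample size chosen arbitrarily:

```python
import numpy as np

# X = U0*(U1 - 0.5) and Y = U0*(U2 - 0.5) are uncorrelated but dependent:
# cov(X, Y) is close to 0, yet E[X^2*Y^2] = 1/720 differs from E[X^2]*E[Y^2] = 1/1296.
rng = np.random.default_rng(2)
n = 1_000_000
u0, u1, u2 = rng.uniform(0.0, 1.0, size=(3, n))

x = u0 * (u1 - 0.5)
y = u0 * (u2 - 0.5)

print("cov(X, Y)    ≈", (x * y).mean() - x.mean() * y.mean())          # close to 0
print("E[X^2 Y^2]   ≈", (x**2 * y**2).mean(), "(exact: 1/720)")
print("E[X^2]E[Y^2] ≈", (x**2).mean() * (y**2).mean(), "(exact: 1/1296)")
```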
