INTRODUCTION TO STATISTICS AND ECONOMETRICS

Bivariate Random Variables

The last row in Example 3.2.2 shows the values taken jointly by two random variables X and Y. Since a quantity such as (1, 1) is not a real number, we do not have a random variable here as defined in Definition 3.2.1. But it is convenient to have a name for a pair of random variables put together. Thus we have

DEFINITION 3.2.2 A bivariate discrete random variable is a variable that takes a countable number of points on the plane with certain probabilities.

The probability distribution of a bivariate random variable is determined by the equations P(X = x_i, Y = y_j) = p_ij, i = 1, 2, . . . , n, j = 1, 2, . . . , m. We call p_ij the joint probability; again, n and/or m may be ∞ in some cases.

When we have a bivariate random variable in mind, the probability distribution of one of the univariate random variables is given a special name: marginal probability distribution. Because of probability axiom (3) of Section 2.2, the marginal probability is related to the joint probabilities by the following relationship.

Marginal probability

P(X = x_i) = Σ_{j=1}^{m} P(X = x_i, Y = y_j),   i = 1, 2, . . . , n.
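
As a small illustration of this rule (a Python sketch, not part of the original text; the joint table and names are assumptions made for the example), the marginal distribution of X is obtained by summing each row of the joint table over j:

    # Joint probabilities p_ij stored as a nested list: rows correspond to the
    # values x_1, x_2 of X, columns to the values y_1, y_2, y_3 of Y.
    # The numbers are illustrative only.
    joint = [
        [0.125, 0.250, 0.250],   # P(X = x_1, Y = y_j), j = 1, 2, 3
        [0.125, 0.125, 0.125],   # P(X = x_2, Y = y_j), j = 1, 2, 3
    ]

    # P(X = x_i) = sum over j of P(X = x_i, Y = y_j)
    marginal_x = [sum(row) for row in joint]
    print(marginal_x)            # [0.625, 0.375]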

Using Theorem 2.4.1, we can define

Conditional probability

P(X = x_i | Y = y_j) = P(X = x_i, Y = y_j) / P(Y = y_j),   if P(Y = y_j) > 0.
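
In the same spirit, here is a sketch of the conditional probability (again illustrative Python, not from the text; the zero check mirrors the requirement P(Y = y_j) > 0 in the definition):

    # joint[i][j] = P(X = x_i, Y = y_j); illustrative values as before.
    joint = [
        [0.125, 0.250, 0.250],
        [0.125, 0.125, 0.125],
    ]

    def p_x_given_y(i, j):
        """P(X = x_i | Y = y_j), defined only when P(Y = y_j) > 0."""
        p_y = sum(row[j] for row in joint)      # marginal P(Y = y_j)
        if p_y == 0:
            raise ValueError("conditioning on an event of probability zero")
        return joint[i][j] / p_y

    print(p_x_given_y(0, 0))                    # 0.125 / 0.25 = 0.5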

In Definition 2.4.1 we defined independence between a pair of events. Here we shall define independence between a pair of discrete random variables.

DEFINITION 3.2.3 Discrete random variables X and Y are said to be independent if the event (X = x_i) and the event (Y = y_j) are independent for all i, j. That is to say, P(X = x_i, Y = y_j) = P(X = x_i)P(Y = y_j) for all i, j.

TABLE 3.1  Probability distribution of a bivariate random variable

    X \ Y    y_1     y_2     . . .   y_m
    x_1      p_11    p_12    . . .   p_1m    p_10
    x_2      p_21    p_22    . . .   p_2m    p_20
    . . .
    x_n      p_n1    p_n2    . . .   p_nm    p_n0
             p_01    p_02    . . .   p_0m

It is instructive to represent the probability distribution of a bivariate random variable in an n × m table. See Table 3.1. Affixed to the end of Table 3.1 are a column and a row representing the marginal probabilities, calculated by the rules p_i0 = Σ_{j=1}^{m} p_ij and p_0j = Σ_{i=1}^{n} p_ij. (The word marginal comes from the positions of the marginal probabilities in the table.) By looking at the table we can quickly determine whether X and Y are independent or not according to the following theorem.
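
The construction of Table 3.1, with the marginal column p_i0 and the marginal row p_0j affixed, can be sketched as follows (illustrative Python; the table values are assumptions):

    # joint[i][j] = p_ij; illustrative values.
    joint = [
        [0.125, 0.250, 0.250],
        [0.125, 0.125, 0.125],
    ]

    # p_i0 = sum over j of p_ij   (rightmost column of Table 3.1)
    row_marginals = [sum(row) for row in joint]

    # p_0j = sum over i of p_ij   (bottom row of Table 3.1)
    col_marginals = [sum(row[j] for row in joint) for j in range(len(joint[0]))]

    for row, p_i0 in zip(joint, row_marginals):
        print(row + [p_i0])        # a row of the table followed by its marginal
    print(col_marginals)           # the bottom marginal row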

THEOREM 3.2.1 Discrete random variables X and Y with the probability distribution given in Table 3.1 are independent if and only if every row is proportional to any other row, or, equivalently, every column is proportional to any other column.

Proof ("only if" part). Consider, for example, the first two rows. We have

(3.2.1)   p_1j / p_2j = [P(x_1 | y_j)P(y_j)] / [P(x_2 | y_j)P(y_j)] = P(x_1 | y_j) / P(x_2 | y_j)   for every j.

If X and Y are independent, we have by Definition 3.2.3

(3.2.2)   P(x_1 | y_j) / P(x_2 | y_j) = P(x_1) / P(x_2),

which does not depend on j. Therefore, the first two rows are proportional to each other. The same argument holds for any pair of rows and any pair of columns.

("if" part). Suppose all the rows are proportional to each other. Then from (3.2.1) we have

(3.2.3)   P(x_i | y_j) = c_ik · P(x_k | y_j)   for every i, k, and j.

Multiply both sides of (3.2.3) by P(y_j) and sum over j to get

(3.2.4)   P(x_i) = c_ik · P(x_k)   for every i and k.

From (3.2.3) and (3.2.4) we have

(3.2.5)   P(x_i | y_j) / P(x_i) = P(x_k | y_j) / P(x_k)   for every i, j, and k.

Therefore

(3.2.6)   P(x_i | y_j) = c_j · P(x_i)   for every i and j.

Summing both sides over i, we determine c_j to be unity for every j. Therefore X and Y are independent. □
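
Theorem 3.2.1 translates directly into a numerical check. Below is a sketch (illustrative Python; the function names, tolerance, and example tables are assumptions) that tests independence both through Definition 3.2.3 and through the row-proportionality criterion; for a well-formed probability table the two tests agree.

    import math

    def is_independent(joint, tol=1e-12):
        """Definition 3.2.3: p_ij = p_i0 * p_0j for all i, j."""
        row_m = [sum(row) for row in joint]
        col_m = [sum(row[j] for row in joint) for j in range(len(joint[0]))]
        return all(math.isclose(joint[i][j], row_m[i] * col_m[j], abs_tol=tol)
                   for i in range(len(joint)) for j in range(len(joint[0])))

    def rows_proportional(joint, tol=1e-12):
        """Theorem 3.2.1: every row is proportional to the first row
        (cross-multiplication avoids division; assumes no row is all zeros)."""
        base = joint[0]
        return all(math.isclose(row[j] * base[k], row[k] * base[j], abs_tol=tol)
                   for row in joint[1:]
                   for j in range(len(base)) for k in range(len(base)))

    independent = [[0.12, 0.28], [0.18, 0.42]]    # 0.12/0.18 = 0.28/0.42
    dependent   = [[0.25, 0.25], [0.10, 0.40]]
    print(is_independent(independent), rows_proportional(independent))   # True True
    print(is_independent(dependent), rows_proportional(dependent))       # False False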

We shall give two examples of nonindependent random variables.

EXAMPLE 3.2.3 Let the joint probability distribution of X and Y be given by

    X \ Y      1        0
      1       2/8      1/8      3/8
      0       2/8      3/8      5/8
              4/8      4/8

Then we have P(Y = 1 | X = 1) = (2/8)/(3/8) = 2/3 and P(Y = 1 | X = 0) = (2/8)/(5/8) = 2/5, which shows that X and Y are not independent.
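
A quick numerical check of this example (a Python sketch; the joint probabilities are those displayed in the table above) confirms that the two conditional probabilities differ:

    from fractions import Fraction as F

    # joint[i][j] with rows X = 1, 0 and columns Y = 1, 0, as in the table above
    joint = [[F(2, 8), F(1, 8)],
             [F(2, 8), F(3, 8)]]

    p_x = [sum(row) for row in joint]        # P(X = 1) = 3/8, P(X = 0) = 5/8
    print(joint[0][0] / p_x[0])              # P(Y = 1 | X = 1) = 2/3
    print(joint[1][0] / p_x[1])              # P(Y = 1 | X = 0) = 2/5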

EXAMPLE 3.2.4 Random variables X and Y defined below are not independent, but X² and Y² are independent.

P(X = 1) = p,   0 < p < 1,
P(X = 0) = 1 - p,
P(Y = 1 | X = 1) = 1/2,
P(Y = 0 | X = 1) = 1/4,
P(Y = -1 | X = 1) = 1/4,
P(Y = 1 | X = 0) = 1/4,
P(Y = 0 | X = 0) = 1/4,
P(Y = -1 | X = 0) = 1/2.
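
Both claims can be verified numerically. The following sketch (illustrative Python; p is fixed at an arbitrary value in (0, 1), and the helper names are assumptions) builds the joint distribution of (X, Y) from the conditional probabilities above, then tests independence for (X, Y) and for (X², Y²):

    from fractions import Fraction as F
    from itertools import product

    p = F(1, 3)                              # any value with 0 < p < 1 will do

    # joint[(x, y)] = P(X = x, Y = y), built from the conditionals above
    cond = {1: {1: F(1, 2), 0: F(1, 4), -1: F(1, 4)},
            0: {1: F(1, 4), 0: F(1, 4), -1: F(1, 2)}}
    px = {1: p, 0: 1 - p}
    joint = {(x, y): px[x] * cond[x][y] for x in px for y in (1, 0, -1)}

    def independent(dist):
        """Does P(U = u, V = v) = P(U = u)P(V = v) for every u, v?"""
        pu, pv = {}, {}
        for (u, v), pr in dist.items():
            pu[u] = pu.get(u, 0) + pr
            pv[v] = pv.get(v, 0) + pr
        return all(dist.get((u, v), 0) == pu[u] * pv[v]
                   for u, v in product(pu, pv))

    print(independent(joint))                # False: X and Y are dependent

    squares = {}
    for (x, y), pr in joint.items():
        squares[(x * x, y * y)] = squares.get((x * x, y * y), 0) + pr
    print(independent(squares))              # True: X² and Y² are independent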

Note that this example does not contradict Theorem 3.5.1. The word function implies that each value of the domain is mapped to a unique value of the range; since Y² = 1 corresponds to both Y = 1 and Y = -1, Y cannot be regarded as a function of Y², so the theorem cannot be applied to X² and Y² to deduce the independence of X and Y.
