INTRODUCTION TO STATISTICS AND ECONOMETRICS

NEYMAN-PEARSON LEMMA

In this section we study the Bayesian strategy of choosing an optimal test among all the admissible tests and a practical method which enables us to find a best test of a given size. The latter is due to Neyman and Pearson and is stated in the lemma that bears their names. A Bayesian interpretation of the Neyman-Pearson lemma will be pedagogically useful here.

figure 9.4 A set of admissible characteristics (plotted in the α, β plane)

We first consider how the Bayesian would solve the problem of hypothesis testing. For her it is a matter of choosing between H0 and H1 given the posterior probabilities P(H0 | x) and P(H1 | x), where x is the observed value of X. Suppose the loss of making a wrong decision is as given in Table 9.2. For example, if we choose H0 when H1 is in fact true, we incur a loss γ2.

Assuming that the Bayesian chooses the decision for which the expected loss is smaller, where the expectation is taken with respect to the posterior distribution, her solution is given by the rule

(9.3.1) Reject H0 if γ1P(H0 | x) < γ2P(H1 | x).

In other words, her critical region, R0, is given by

(9.3.2) R0 = {x | γ1P(H0 | x) < γ2P(H1 | x)}.

Alternatively, the Bayesian problem may be formulated as that of determining a critical region R in the domain of X so as to

(9.3.3) Minimize φ(R) = γ1P(H0 | X ∈ R)P(X ∈ R)
                + γ2P(H1 | X ∈ R̄)P(X ∈ R̄).

table 9.2 Loss matrix in hypothesis testing

                 State of Nature
Decision         H0          H1
H0               0           γ2
H1               γ1          0
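
As a minimal numerical sketch of the rule (9.3.1), assuming hypothetical values for the losses, the prior probabilities, and the likelihoods at the observed x, the posterior expected losses can be compared directly:

```python
# Minimal numerical sketch of the Bayes rule (9.3.1)-(9.3.2).
# All numbers below (losses, priors, likelihoods) are hypothetical.

gamma1 = 1.0   # loss from rejecting H0 when H0 is true (Table 9.2)
gamma2 = 2.0   # loss from accepting H0 when H1 is true (Table 9.2)
prior_H0, prior_H1 = 0.5, 0.5

# Likelihoods L(x | H0) and L(x | H1) at the observed x (assumed values).
lik_H0, lik_H1 = 0.10, 0.30

# Posterior probabilities by Bayes' rule.
marginal = lik_H0 * prior_H0 + lik_H1 * prior_H1
post_H0 = lik_H0 * prior_H0 / marginal
post_H1 = lik_H1 * prior_H1 / marginal

# Rule (9.3.1): reject H0 when gamma1 * P(H0 | x) < gamma2 * P(H1 | x).
reject_H0 = gamma1 * post_H0 < gamma2 * post_H1
print(f"P(H0|x)={post_H0:.3f}, P(H1|x)={post_H1:.3f}, reject H0: {reject_H0}")
```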

We shall show that R0 as defined by (9.3.2) is indeed the solution of (9.3.3). Let R1 be some other set in the domain of X. Then we have

(9.3.4) φ(R0) = γ1P(H0 | R0 ∩ R1)P(R0 ∩ R1)
                + γ1P(H0 | R0 ∩ R̄1)P(R0 ∩ R̄1)
                + γ2P(H1 | R̄0 ∩ R1)P(R̄0 ∩ R1)
                + γ2P(H1 | R̄0 ∩ R̄1)P(R̄0 ∩ R̄1)

and

(9.3.5) φ(R1) = γ1P(H0 | R1 ∩ R0)P(R1 ∩ R0)
                + γ1P(H0 | R1 ∩ R̄0)P(R1 ∩ R̄0)
                + γ2P(H1 | R̄1 ∩ R0)P(R̄1 ∩ R0)
                + γ2P(H1 | R̄1 ∩ R̄0)P(R̄1 ∩ R̄0).

Compare the terms on the right-hand side of (9.3.4) with those on the right-hand side of (9.3.5). The first and fourth terms are identical. The second and the third terms of (9.3.4) are smaller than the third and the second terms of (9.3.5), respectively, because of the definition of R0 given in (9.3.2). Therefore we have

(9.3.6) φ(R0) < φ(R1).

We can rewrite φ(R) as

(9.3.7) φ(R) = γ1P(H0)P(R | H0) + γ2P(H1)P(R̄ | H1)
             = η0α(R) + η1β(R),

where η0 = γ1P(H0), η1 = γ2P(H1), and P(H0) and P(H1) are the prior probabilities for the two hypotheses. When the minimand is written in the form of (9.3.7), it becomes clear that the Bayesian optimal test R0 is determined at the point where the curve of the admissible characteristics on the α, β plane, such as those drawn in Figures 9.2 and 9.4, touches the line that lies closest to the origin among all the straight lines with the slope equal to −η0/η1. If the curve is differentiable as in Figure 9.4, the point of the characteristics of the Bayesian optimal test is the point of tangency between the curve of admissible characteristics and the straight line with slope −η0/η1.
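
The following sketch (with entirely hypothetical error characteristics and weights η0, η1) illustrates how the minimand (9.3.7) picks out one admissible test for each ratio η0/η1, and why an inadmissible test is never chosen:

```python
# Sketch of choosing the Bayes-optimal test from a set of (alpha, beta)
# characteristics by minimizing eta0*alpha + eta1*beta, as in (9.3.7).
# The characteristics and weights below are hypothetical.

tests = {                 # test label -> (alpha, beta)
    "A": (0.01, 0.40),
    "B": (0.05, 0.20),
    "C": (0.10, 0.12),
    "D": (0.20, 0.15),    # inadmissible: dominated by C, never optimal
}

def bayes_choice(eta0, eta1):
    """Return the test minimizing eta0*alpha + eta1*beta (equation 9.3.7)."""
    return min(tests, key=lambda t: eta0 * tests[t][0] + eta1 * tests[t][1])

# A large eta0 (steep iso-cost line of slope -eta0/eta1) favors small alpha.
print(bayes_choice(eta0=10.0, eta1=1.0))   # weights Type I error heavily
print(bayes_choice(eta0=1.0, eta1=10.0))   # weights Type II error heavily
```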

The classical statistician does not wish to specify the losses γ1 and γ2 or the prior probabilities P(H0) and P(H1); hence he does not wish to specify the ratio η0/η1, without which the minimization of (9.3.7) cannot be carried out. The best he can do, therefore, is to obtain the set of admissible tests. This attitude of the classical statistician is analogous to that of the economist who obtains the Pareto optimality condition without specifying the weights on two people’s utilities in the social welfare function.

By virtue of Theorem 9.2.1, which shows the convexity of the curve of admissible characteristics, the above analysis implies that every admissible test is the Bayesian optimal test corresponding to some value of the ratio η0/η1. This fact is the basis of the Neyman-Pearson lemma. Let L(x) be the joint density or probability of X depending on whether X is continuous or discrete. Multiply both sides of the inequality in (9.3.2) by L(x) and replace P(Hi | x)L(x) with L(x | Hi)P(Hi), i = 0, 1. Then the Bayesian optimal test R0 can be written as

(9.3.8) R0 = {x | L(x | H1)/L(x | H0) > η0/η1}.

Thus we have proved

THEOREM 9.3.1 (Neyman-Pearson lemma) In testing H0: θ = θ0 against H1: θ = θ1, the best critical region of size α is given by

(9.3.9) R = {x | L(x | θ1)/L(x | θ0) > c},

where L is the likelihood function and c (the critical value) is determined so as to satisfy

(9.3.10) P(R | θ0) = α,

provided that such c exists. (Here, as well as in the following analysis, θ may be a vector.)

The last clause in the theorem is necessary because, for example, in Example 9.2.1 the Neyman-Pearson test consists of (1), (4), (7), and (8), and there is no c that satisfies (9.3.10) for α = 1/2.
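
For a discrete X the construction in Theorem 9.3.1 can be carried out by ranking the sample points by the likelihood ratio and adding them to the critical region while the size stays within α. The sketch below uses made-up probability vectors for illustration; when no c yields size exactly α, the construction simply stops short of α, which roughly corresponds to the proviso in the theorem.

```python
# Sketch of constructing the Neyman-Pearson critical region (9.3.9)-(9.3.10)
# for a discrete X.  The two probability vectors below are hypothetical.

# P(X = x | H0) and P(X = x | H1) for x = 0,...,5 (made-up numbers).
p0 = [0.30, 0.25, 0.20, 0.15, 0.07, 0.03]
p1 = [0.05, 0.10, 0.15, 0.20, 0.25, 0.25]

alpha = 0.10
# Order sample points by the likelihood ratio L(x|H1)/L(x|H0), largest first.
order = sorted(range(len(p0)), key=lambda x: p1[x] / p0[x], reverse=True)

region, size = [], 0.0
for x in order:
    if size + p0[x] > alpha + 1e-12:
        break              # cannot add the next point without exceeding alpha
    region.append(x)
    size += p0[x]

power = sum(p1[x] for x in region)
print(f"critical region {sorted(region)}, size {size:.2f}, power {power:.2f}")
```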

THEOREM 9.3.2 The Bayes test is admissible.

Proof. Let R0 be as defined in (9.3.2). Then, by (9.3.7),

(9.3.11) η0α(R0) + η1β(R0) ≤ η0α(R) + η1β(R) for any R.

Therefore, it is not possible to have α(R) ≤ α(R0) and β(R) ≤ β(R0) with a strict inequality in at least one. □

The Neyman-Pearson test is admissible because it is a Bayes test.

The choice of α is in principle left to the researcher, who should determine it based on subjective evaluation of the relative costs of the two types of error. There is a tendency, however, for the classical statistician automatically to choose α = 0.05 or 0.01. A small value is often selected because of the classical statistician’s reluctance to abandon the null hypothesis until the evidence of the sample becomes overwhelming. We shall consider a few examples of application of Theorem 9.3.1.

EXAMPLE 9.3.1 Let X be distributed as B(n, p) and let x be its observed value. The best critical region for testing H0: p = p0 against H1: p = p1 is, from (9.3.9),

(9.3.12) [p1^x (1 − p1)^(n−x)]/[p0^x (1 − p0)^(n−x)] > c for some c.

Taking the logarithm of both sides of (9.3.12) and collecting terms, we get

(9.3.13) x[log(p1/p0) − log((1 − p1)/(1 − p0))] > log c − n log((1 − p1)/(1 − p0)).

Suppose p1 > p0. Then the term inside the brackets on the left-hand side of (9.3.13) is positive. Therefore the best critical region of size α is defined by

(9.3.14) x > d, where d is determined by P(X > d | H0) = α.

If p1 < p0, the inequality in (9.3.14) is reversed. The result is consistent with our intuition.
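
A small sketch of Example 9.3.1 (with hypothetical n, p0, and α, using only the standard library) finds the smallest d for which the one-sided region x > d has size at most α; because X is discrete, the attainable size is generally below the nominal α:

```python
# Sketch of the binomial test in Example 9.3.1: for p1 > p0 the best
# critical region has the form x > d, with d chosen so that
# P(X > d | p0) <= alpha.  n, p0, alpha below are hypothetical.
from math import comb

n, p0, alpha = 20, 0.3, 0.05

def tail(d):
    """P(X > d | p0) for X ~ B(n, p0)."""
    return sum(comb(n, x) * p0**x * (1 - p0)**(n - x) for x in range(d + 1, n + 1))

# Smallest d whose upper tail probability under H0 does not exceed alpha.
d = next(d for d in range(n + 1) if tail(d) <= alpha)
print("d =", d, "actual size =", round(tail(d), 4))
```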

EXAMPLE 9.3.2 Let Xi be distributed as N(μ, σ^2), i = 1, 2, . . . , n, where σ^2 is assumed known. Let xi be the observed value of Xi. The best critical region for testing H0: μ = μ0 against H1: μ = μ1 is, from (9.3.9),

(9.3.15) exp{−[Σ(xi − μ1)^2 − Σ(xi − μ0)^2]/(2σ^2)} > c for some c,

where Σ denotes summation over i = 1, 2, . . . , n.

Taking the logarithm of both sides of (9.3.15) and collecting terms, we obtain

(9.3.16) (μ1 − μ0) Σ xi > σ^2 log c + (n/2)(μ1^2 − μ0^2).

Therefore if μ1 > μ0, the best critical region of size α is of the form

(9.3.17) x̄ > d, where d is determined by P(X̄ > d | H0) = α.

If μ1 < μ0, the inequality in (9.3.17) is reversed. This result is also consistent with our intuition.
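
A corresponding sketch for Example 9.3.2 (with hypothetical μ0, σ, n, and α): under H0 the sample mean is distributed as N(μ0, σ^2/n), so the critical value d in (9.3.17) is a normal quantile.

```python
# Sketch of Example 9.3.2: for mu1 > mu0 reject H0 when the sample mean
# exceeds d, with d set so that P(Xbar > d | mu0) = alpha.  Under H0,
# Xbar ~ N(mu0, sigma^2 / n).  The numbers below are hypothetical.
from statistics import NormalDist

mu0, sigma, n, alpha = 0.0, 1.0, 25, 0.05

# d = mu0 + z_{1-alpha} * sigma / sqrt(n)
d = mu0 + NormalDist().inv_cdf(1 - alpha) * sigma / n**0.5
print("reject H0 when xbar >", round(d, 4))
```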

In both examples the critical region is reduced to a subset of the domain of a univariate statistic (which in both cases is a sufficient statistic). There are often situations where a univariate statistic is used to test a hypothesis about a parameter. As stated in Section 9.1, such a statistic is called a test statistic. Common sense tells us that the better the estimator we use as a test statistic, the better the test becomes. Therefore, even in situations where the Neyman-Pearson lemma does not indicate the best test of a given size α, we should do well if we used the best available estimator of a parameter as a test statistic to test a hypothesis concerning the parameter. Given a test statistic, it is often possible to find a reasonable critical region on intuitive grounds. Intuition, however, does not always work, as the following counterexample shows.

EXAMPLE 9.3.3 Let the density of X be given by

(9.3.18) f(x, θ) = 1/[2(1 + x − θ)^2]   if x ≥ θ,
                 = 1/[2(1 − x + θ)^2]   if x < θ.

Find the Neyman-Pearson test of H0: θ = 0 against H1: θ = θ1 > 0. The densities under H0 and H1 are shown in Figure 9.5. We have

(9.3.19) f(x, θ1)/f(x, 0) = (1 − x)^2/(1 − x + θ1)^2   if x < 0,
                          = (1 + x)^2/(1 − x + θ1)^2   if 0 ≤ x < θ1,
                          = (1 + x)^2/(1 + x − θ1)^2   if x ≥ θ1.

The Neyman-Pearson critical region, denoted R, is identified in Figure 9.6. The shape of the function (9.3.19) changes with θ1. In the figure it is drawn assuming θ1 = 1.
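
To see why intuition fails here, the ratio (9.3.19) can be evaluated numerically. The sketch below (θ1 = 1 as in the figure; the grid and the critical value c are arbitrary choices for illustration) shows that, for the c chosen here, the region where the ratio exceeds c is a bounded interval rather than a one-sided tail of the form x > d.

```python
# Sketch of the likelihood ratio (9.3.19) in Example 9.3.3, evaluated on a
# grid so that the shape of the critical region {x : ratio > c} can be
# inspected.  theta1 = 1 as in Figure 9.6; the grid and c are hypothetical.

def f(x, theta):
    """Density (9.3.18): f(x, theta) = 1 / (2 * (1 + |x - theta|)^2)."""
    return 1.0 / (2.0 * (1.0 + abs(x - theta)) ** 2)

def ratio(x, theta1=1.0):
    """Likelihood ratio f(x, theta1) / f(x, 0) of (9.3.19)."""
    return f(x, theta1) / f(x, 0.0)

c = 1.5                                     # hypothetical critical value
xs = [i / 10.0 for i in range(-50, 51)]     # grid on [-5, 5]
region = [x for x in xs if ratio(x) > c]
print("ratio(0) =", ratio(0.0), " ratio(1) =", ratio(1.0))
print("grid points in the critical region run from", min(region), "to", max(region))
```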
