Springer Texts in Business and Economics

12.1 Variance-Covariance Matrix of Random Effects.

a. From (12.17) we get

$\Omega = \sigma_\mu^2(I_N \otimes J_T) + \sigma_\nu^2(I_N \otimes I_T)$

Replacing $J_T$ by $T\bar{J}_T$, and $I_T$ by $(E_T + \bar{J}_T)$ where $E_T$ is by definition $(I_T - \bar{J}_T)$, one gets

$\Omega = T\sigma_\mu^2(I_N \otimes \bar{J}_T) + \sigma_\nu^2(I_N \otimes E_T) + \sigma_\nu^2(I_N \otimes \bar{J}_T)$

Collecting terms with the same matrices, we get

$\Omega = (T\sigma_\mu^2 + \sigma_\nu^2)(I_N \otimes \bar{J}_T) + \sigma_\nu^2(I_N \otimes E_T) = \sigma_1^2 P + \sigma_\nu^2 Q$ where $\sigma_1^2 = T\sigma_\mu^2 + \sigma_\nu^2$.

b. $P = Z_\mu(Z_\mu'Z_\mu)^{-1}Z_\mu' = I_N \otimes \bar{J}_T$ is the projection matrix on $Z_\mu$. Hence, it is by definition symmetric and idempotent. Similarly, $Q = I_{NT} - P$ is the projection matrix on the orthogonal complement of $Z_\mu$. Hence, $Q$ is also symmetric and idempotent. By definition, $P + Q = I_{NT}$. Also, $PQ = P(I_{NT} - P) = P - P^2 = P - P = 0$.

c. From (12.18) and (12.19) one gets

$\Omega\,\Omega^{-1} = (\sigma_1^2 P + \sigma_\nu^2 Q)\left(\dfrac{P}{\sigma_1^2} + \dfrac{Q}{\sigma_\nu^2}\right) = P + Q = I_{NT}$

since $P^2 = P$, $Q^2 = Q$ and $PQ = 0$ as verified in part (b). Similarly, $\Omega^{-1}\Omega = I_{NT}$.

d. From (12.20) one gets

$\Omega^{-1/2}\Omega^{-1/2} = \left(\dfrac{1}{\sigma_1}P + \dfrac{1}{\sigma_\nu}Q\right)\left(\dfrac{1}{\sigma_1}P + \dfrac{1}{\sigma_\nu}Q\right) = \dfrac{1}{\sigma_1^2}P + \dfrac{1}{\sigma_\nu^2}Q = \Omega^{-1}$

using the fact that $P^2 = P$, $Q^2 = Q$ and $PQ = 0$.
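All four parts of this problem can be verified numerically. The following is a minimal Python/numpy sketch (any matrix language would do); the dimensions and variance components are illustrative values, not taken from the text:

```python
import numpy as np

N, T = 3, 4                                   # illustrative dimensions
s2_mu, s2_nu = 2.0, 0.5                       # sigma_mu^2, sigma_nu^2 (arbitrary)
s2_1 = T * s2_mu + s2_nu                      # sigma_1^2 = T*sigma_mu^2 + sigma_nu^2

J_T = np.ones((T, T))                         # T x T matrix of ones
P = np.kron(np.eye(N), J_T / T)               # P = I_N kron J_T-bar
Q = np.eye(N * T) - P                         # Q = I_NT - P

Omega = s2_mu * np.kron(np.eye(N), J_T) + s2_nu * np.eye(N * T)   # (12.17)
Omega_inv = P / s2_1 + Q / s2_nu                                  # (12.19)
Omega_inv_half = P / np.sqrt(s2_1) + Q / np.sqrt(s2_nu)           # Omega^(-1/2)

assert np.allclose(Omega, s2_1 * P + s2_nu * Q)           # part (a): spectral form
assert np.allclose(P, P.T) and np.allclose(P @ P, P)      # part (b): symmetric idempotent
assert np.allclose(P @ Q, np.zeros((N * T, N * T)))       # part (b): PQ = 0
assert np.allclose(Omega @ Omega_inv, np.eye(N * T))      # part (c)
assert np.allclose(Omega_inv_half @ Omega_inv_half, Omega_inv)  # part (d)
```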

12.2 Fuller and Battese (1974) Transformation.

From (12.20) one gets $\sigma_\nu\Omega^{-1/2} = Q + (\sigma_\nu/\sigma_1)P$. Therefore,

$y^* = \sigma_\nu\Omega^{-1/2}y = Qy + (\sigma_\nu/\sigma_1)Py = y - Py + (\sigma_\nu/\sigma_1)Py = y - (1 - (\sigma_\nu/\sigma_1))Py = y - \theta Py$

where $\theta = 1 - (\sigma_\nu/\sigma_1)$. Recall that the typical element of $Py$ is $\bar{y}_{i.}$; therefore, the typical element of $y^*$ is $y^*_{it} = y_{it} - \theta\bar{y}_{i.}$.
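A quick numeric check of this transformation, sketched in Python/numpy with made-up dimensions, variance components, and a random $y$:

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, s2_mu, s2_nu = 3, 4, 2.0, 0.5           # illustrative values
s2_1 = T * s2_mu + s2_nu
P = np.kron(np.eye(N), np.ones((T, T)) / T)   # averages each individual's observations
Q = np.eye(N * T) - P
y = rng.normal(size=N * T)

# sigma_nu * Omega^(-1/2) from (12.20)
transform = Q + np.sqrt(s2_nu / s2_1) * P
y_star = transform @ y

theta = 1 - np.sqrt(s2_nu / s2_1)
assert np.allclose(y_star, y - theta * (P @ y))   # y* = y - theta * P y
# element-wise: y*_it = y_it - theta * ybar_i.
ybar = (P @ y)[::T]                               # the individual means ybar_i.
assert np.allclose(y_star[:T], y[:T] - theta * ybar[0])
```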

12.3 Unbiased Estimates of the Variance Components.

$E(u'Pu) = E[\mathrm{tr}(uu'P)] = \mathrm{tr}[E(uu')P] = \mathrm{tr}(\Omega P)$. From (12.21), $\Omega P = \sigma_1^2 P$ since $PQ = 0$. Hence, from (12.18),

$E(\hat{\sigma}_1^2) = \dfrac{E(u'Pu)}{\mathrm{tr}(P)} = \dfrac{\mathrm{tr}(\sigma_1^2 P)}{\mathrm{tr}(P)} = \sigma_1^2.$

Similarly, $E(u'Qu) = \mathrm{tr}(\Omega Q) = \mathrm{tr}(\sigma_\nu^2 Q)$ where the last equality follows from (12.18) and the fact that $\Omega Q = \sigma_\nu^2 Q$ since $PQ = 0$. Hence, from (12.22),

$E(\hat{\sigma}_\nu^2) = \dfrac{E(u'Qu)}{\mathrm{tr}(Q)} = \dfrac{\mathrm{tr}(\sigma_\nu^2 Q)}{\mathrm{tr}(Q)} = \sigma_\nu^2.$
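The two collapse results $\Omega P = \sigma_1^2 P$ and $\Omega Q = \sigma_\nu^2 Q$, and the trace ratios they imply, can be checked numerically. A sketch with illustrative values:

```python
import numpy as np

N, T, s2_mu, s2_nu = 3, 4, 2.0, 0.5           # arbitrary illustrative values
s2_1 = T * s2_mu + s2_nu
P = np.kron(np.eye(N), np.ones((T, T)) / T)
Q = np.eye(N * T) - P
Omega = s2_1 * P + s2_nu * Q                  # spectral form from problem 12.1

# Omega*P collapses to sigma_1^2*P and Omega*Q to sigma_nu^2*Q because PQ = 0
assert np.allclose(Omega @ P, s2_1 * P)
assert np.allclose(Omega @ Q, s2_nu * Q)
# hence tr(Omega P)/tr(P) = sigma_1^2 and tr(Omega Q)/tr(Q) = sigma_nu^2
assert np.isclose(np.trace(Omega @ P) / np.trace(P), s2_1)
assert np.isclose(np.trace(Omega @ Q) / np.trace(Q), s2_nu)
```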

12.4 Swamy and Arora (1972) Estimates of the Variance Components.

a. $\hat{\sigma}_\nu^2$ given by (12.23) is the $s^2$ of the regression given by (12.11). In fact,

$\hat{\sigma}_\nu^2 = y'Q[I_{NT} - P_{QX}]Qy/[N(T-1) - K]$

where $P_{QX} = QX(X'QX)^{-1}X'Q$. Substituting $Qy$ from (12.12) into $\hat{\sigma}_\nu^2$, one gets

$\hat{\sigma}_\nu^2 = \nu'Q[I_{NT} - P_{QX}]Q\nu/[N(T-1) - K]$ with $Q\nu \sim (0, \sigma_\nu^2 Q)$. Therefore,

$E[\nu'Q[I_{NT} - P_{QX}]Q\nu] = E[\mathrm{tr}\{Q\nu\nu'Q[I_{NT} - P_{QX}]\}] = \sigma_\nu^2\,\mathrm{tr}\{Q - QP_{QX}\} = \sigma_\nu^2\{N(T-1) - \mathrm{tr}(P_{QX})\}$

where the last equality follows from the fact that $QP_{QX} = P_{QX}$. Also, $\mathrm{tr}(P_{QX}) = \mathrm{tr}[(X'QX)(X'QX)^{-1}] = \mathrm{tr}(I_K) = K$. Hence, $E(\hat{\sigma}_\nu^2) = \sigma_\nu^2$.

b. Similarly, $Py = PZ\delta + Pu$. In fact,

$\hat{\sigma}_1^2 = y'P[I_{NT} - P_{PZ}]Py/(N - K - 1)$

where $P_{PZ} = PZ(Z'PZ)^{-1}Z'P$. Substituting $Py$ into $\hat{\sigma}_1^2$, one gets

$\hat{\sigma}_1^2 = u'P[I_{NT} - P_{PZ}]Pu/(N - K - 1)$ with $Pu \sim (0, \sigma_1^2 P)$, as can be easily verified from (12.18). Therefore,

$E(u'P[I_{NT} - P_{PZ}]Pu) = E[\mathrm{tr}\{Puu'P(I_{NT} - P_{PZ})\}] = \sigma_1^2\,\mathrm{tr}\{P - PP_{PZ}\} = \sigma_1^2\{N - \mathrm{tr}(P_{PZ})\}$

where the last equality follows from the fact that $PP_{PZ} = P_{PZ}$. Also,

$\mathrm{tr}(P_{PZ}) = \mathrm{tr}[(Z'PZ)(Z'PZ)^{-1}] = \mathrm{tr}(I_{K'}) = K'$

where $K' = K + 1$. Hence, $E(\hat{\sigma}_1^2) = \sigma_1^2$.
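The two trace facts that drive this unbiasedness result, $\mathrm{tr}(P_{QX}) = K$ and $\mathrm{tr}(P_{PZ}) = K + 1$, can be verified numerically. A sketch with hypothetical random regressors (note $N \ge K + 1$ is needed for $Z'PZ$ to be invertible):

```python
import numpy as np

rng = np.random.default_rng(1)
N, T, K = 5, 4, 2                              # illustrative dimensions, N >= K+1
P = np.kron(np.eye(N), np.ones((T, T)) / T)
Q = np.eye(N * T) - P
X = rng.normal(size=(N * T, K))                # hypothetical regressors
Z = np.hstack([np.ones((N * T, 1)), X])        # add the intercept

P_QX = Q @ X @ np.linalg.solve(X.T @ Q @ X, X.T @ Q)
P_PZ = P @ Z @ np.linalg.solve(Z.T @ P @ Z, Z.T @ P)

assert np.isclose(np.trace(P_QX), K)           # projection on QX has rank K
assert np.isclose(np.trace(P_PZ), K + 1)       # tr(I_K') = K' = K + 1
```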

12.6 System Estimation.

a. OLS on (12.27) yields

$\hat{\delta}_{OLS} = (Z'QZ + Z'PZ)^{-1}(Z'Qy + Z'Py) = (Z'(Q + P)Z)^{-1}Z'(Q + P)y = (Z'Z)^{-1}Z'y$

since $Q + P = I_{NT}$.

b. GLS on (12.27) yields

$\hat{\delta}_{GLS} = (Z'\Omega^{-1}Z)^{-1}Z'\Omega^{-1}y$

where $\Omega^{-1}$ is given by (12.19).
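Part (a) says that stacking the within and between moment equations and running OLS reproduces pooled OLS. A numeric sketch with made-up data:

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, K = 4, 5, 2                              # illustrative dimensions
P = np.kron(np.eye(N), np.ones((T, T)) / T)
Q = np.eye(N * T) - P
Z = np.hstack([np.ones((N * T, 1)), rng.normal(size=(N * T, K))])  # hypothetical regressors
y = rng.normal(size=N * T)

# OLS on the stacked system vs. pooled OLS; they coincide because Q + P = I_NT
system_ols = np.linalg.solve(Z.T @ Q @ Z + Z.T @ P @ Z, Z.T @ Q @ y + Z.T @ P @ y)
pooled_ols = np.linalg.solve(Z.T @ Z, Z.T @ y)
assert np.allclose(system_ols, pooled_ols)
```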

12.7 Random Effects Is More Efficient than Fixed Effects. From (12.30) we have

$\mathrm{var}(\hat{\beta}_{GLS}) = \sigma_\nu^2\left[W_{XX} + \phi^2 B_{XX}\right]^{-1}$

where $W_{XX} = X'QX$, $B_{XX} = X'(P - \bar{J}_{NT})X$ and $\phi^2 = \sigma_\nu^2/\sigma_1^2$. The Within estimator has $\mathrm{var}(\hat{\beta}_{Within}) = \sigma_\nu^2 W_{XX}^{-1}$. Hence,

$\left(\mathrm{var}(\hat{\beta}_{GLS})\right)^{-1} - \left(\mathrm{var}(\hat{\beta}_{Within})\right)^{-1} = \dfrac{1}{\sigma_\nu^2}\left[W_{XX} + \phi^2 B_{XX}\right] - \dfrac{1}{\sigma_\nu^2}W_{XX} = \phi^2 B_{XX}/\sigma_\nu^2$

which is positive semi-definite. Hence, $\mathrm{var}(\hat{\beta}_{Within}) - \mathrm{var}(\hat{\beta}_{GLS})$ is positive semi-definite. This last result uses the well-known fact that if $A^{-1} - B^{-1}$ is positive semi-definite, then $B - A$ is positive semi-definite.
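The efficiency comparison can be illustrated numerically by forming both variance matrices and checking that their difference has non-negative eigenvalues. All parameter values below are made up for the sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
N, T, K = 5, 4, 2
s2_mu, s2_nu = 2.0, 0.5                        # illustrative variance components
s2_1 = T * s2_mu + s2_nu
phi2 = s2_nu / s2_1
NT = N * T
P = np.kron(np.eye(N), np.ones((T, T)) / T)
Q = np.eye(NT) - P
Jbar_NT = np.ones((NT, NT)) / NT
X = rng.normal(size=(NT, K))                   # hypothetical regressors

W_XX = X.T @ Q @ X
B_XX = X.T @ (P - Jbar_NT) @ X
var_gls = s2_nu * np.linalg.inv(W_XX + phi2 * B_XX)
var_within = s2_nu * np.linalg.inv(W_XX)

# var(beta_Within) - var(beta_GLS) should be positive semi-definite
assert np.all(np.linalg.eigvalsh(var_within - var_gls) >= -1e-10)
```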

12.8 Maximum Likelihood Estimation of the Random Effects Model.

a. Differentiating (12.34) with respect to $\phi^2$ yields

$\dfrac{\partial L_c}{\partial\phi^2} = -\dfrac{NT}{2}\cdot\dfrac{d'(P - \bar{J}_{NT})d}{d'[Q + \phi^2(P - \bar{J}_{NT})]d} + \dfrac{N}{2}\cdot\dfrac{1}{\phi^2}$

Setting $\partial L_c/\partial\phi^2 = 0$, we get $T\phi^2 d'(P - \bar{J}_{NT})d = d'Qd + \phi^2 d'(P - \bar{J}_{NT})d$. Solving for $\phi^2$, we get $\phi^2(T - 1)d'(P - \bar{J}_{NT})d = d'Qd$, which yields (12.35).

b. Differentiating (12.34) with respect to $\beta$ yields

$\dfrac{\partial L_c}{\partial\beta} = -\dfrac{NT}{2}\cdot\dfrac{1}{d'[Q + \phi^2(P - \bar{J}_{NT})]d}\cdot\dfrac{\partial\, d'[Q + \phi^2(P - \bar{J}_{NT})]d}{\partial\beta}$

Setting $\partial L_c/\partial\beta = 0$ is equivalent to solving $\partial\, d'[Q + \phi^2(P - \bar{J}_{NT})]d/\partial\beta = 0$. Using the fact that $d = y - X\beta$, this yields

$-2X'[Q + \phi^2(P - \bar{J}_{NT})](y - X\beta) = 0$

Solving for $\beta$, we get $X'[Q + \phi^2(P - \bar{J}_{NT})]X\beta = X'[Q + \phi^2(P - \bar{J}_{NT})]y$, which yields (12.36).
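The closed-form solution of part (a) can be checked by plugging it back into a numerical derivative of the concentrated likelihood. A sketch with an arbitrary residual vector $d$:

```python
import numpy as np

rng = np.random.default_rng(4)
N, T = 5, 4                                    # illustrative dimensions
NT = N * T
P = np.kron(np.eye(N), np.ones((T, T)) / T)
Q = np.eye(NT) - P
A = P - np.ones((NT, NT)) / NT                 # P - J_NT-bar
d = rng.normal(size=NT)                        # stand-in for d = y - X*beta

# closed-form solution of the first-order condition, i.e. (12.35)
phi2_hat = (d @ Q @ d) / ((T - 1) * (d @ A @ d))

def Lc(phi2):
    # the phi^2-dependent terms of the concentrated log-likelihood (12.34)
    return -(NT / 2) * np.log(d @ (Q + phi2 * A) @ d) + (N / 2) * np.log(phi2)

eps = 1e-6
num_deriv = (Lc(phi2_hat + eps) - Lc(phi2_hat - eps)) / (2 * eps)
assert abs(num_deriv) < 1e-4                   # the FOC holds at the closed form
```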

12.9 Prediction in the Random Effects Model.

a. From (12.2) and (12.38), $E(u_{i,T+S}u_{jt}) = \sigma_\mu^2$ for $i = j$ and zero otherwise. The only correlation over time occurs because of the presence of the same individual across the panel. The $\nu_{it}$'s are not correlated for different time periods. In vector form,

$w = E(u_{i,T+S}u) = \sigma_\mu^2(0, \ldots, 0, 1, \ldots, 1, 0, \ldots, 0)'$

where there are $T$ ones for the $i$-th individual. This can be rewritten as $w = \sigma_\mu^2(l_i \otimes \iota_T)$, where $l_i$ is the $i$-th column of $I_N$, i.e., $l_i$ is a vector that has 1 in the $i$-th position and zero elsewhere, and $\iota_T$ is a vector of ones of dimension $T$.

b. $(l_i' \otimes \iota_T')P = (l_i' \otimes \iota_T')(I_N \otimes \bar{J}_T) = (l_i' \otimes \iota_T')$ since $\iota_T'\bar{J}_T = \iota_T'$. Therefore, in (12.39),

$w'\Omega^{-1} = \sigma_\mu^2(l_i' \otimes \iota_T')\left[\dfrac{P}{\sigma_1^2} + \dfrac{Q}{\sigma_\nu^2}\right] = \dfrac{\sigma_\mu^2}{\sigma_1^2}(l_i' \otimes \iota_T')$

since $(l_i' \otimes \iota_T')Q = (l_i' \otimes \iota_T')(I_{NT} - P) = (l_i' \otimes \iota_T') - (l_i' \otimes \iota_T') = 0$.
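The Kronecker-product identities used in part (b) are easy to confirm numerically. A sketch with illustrative values:

```python
import numpy as np

N, T, s2_mu, s2_nu = 3, 4, 2.0, 0.5            # illustrative values
s2_1 = T * s2_mu + s2_nu
P = np.kron(np.eye(N), np.ones((T, T)) / T)
Q = np.eye(N * T) - P
Omega_inv = P / s2_1 + Q / s2_nu

i = 1                                          # pick an individual
l_i = np.eye(N)[:, i]                          # i-th column of I_N
iota_T = np.ones(T)
v = np.kron(l_i, iota_T)                       # (l_i kron iota_T)
w = s2_mu * v                                  # E(u_{i,T+S} u)

assert np.allclose(v @ P, v)                   # (l_i' kron iota_T') P = (l_i' kron iota_T')
assert np.allclose(v @ Q, np.zeros(N * T))     # annihilated by Q
assert np.allclose(w @ Omega_inv, (s2_mu / s2_1) * v)   # w' Omega^{-1} in (12.39)
```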

12.10 Using the motor gasoline data on diskette, the following SAS output and program replicate the results in Tables 12.1-12.7. The program uses the IML procedure (SAS matrix language). This can be easily changed into GAUSS or any other matrix language.

RESULTS OF OLS

              PARAMETER  STANDARD ERROR  T-STATISTICS
INTERCEPT       2.39133     0.11693        20.45017
INCOME          0.88996     0.03581        24.85523
PRICE          -0.89180     0.03031        -29.4180
CAR            -0.76337     0.01861        -41.0232

RESULTS OF BETWEEN

              PARAMETER  STANDARD ERROR  T-STATISTICS
INTERCEPT       2.54163     0.52678         4.82480
INCOME          0.96758     0.15567         6.21571
PRICE          -0.96355     0.13292        -7.24902
CAR            -0.79530     0.08247        -9.64300

RESULTS OF WITHIN

              PARAMETER  STANDARD ERROR  T-STATISTICS
INCOME          0.66225     0.07339         9.02419
PRICE          -0.32170     0.04410        -7.29496
CAR            -0.64048     0.02968        -21.5804

RESULTS OF WALLACE-HUSSAIN

              PARAMETER  STANDARD ERROR  T-STATISTICS
INTERCEPT       1.90580     0.19403         9.82195
INCOME          0.54346     0.06353         8.55377
PRICE          -0.47111     0.04550        -10.3546
CAR            -0.60613     0.02840        -21.3425

RESULTS OF AMEMIYA

              PARAMETER  STANDARD ERROR  T-STATISTICS
INTERCEPT       2.18445     0.21453        10.18228
INCOME          0.60093     0.06542         9.18559
PRICE          -0.36639     0.04138        -8.85497
CAR            -0.62039     0.02718        -22.8227

RESULTS OF SWAMY-ARORA

              PARAMETER  STANDARD ERROR  T-STATISTICS
INTERCEPT       1.99670     0.17824        11.20260
INCOME          0.55499     0.05717         9.70689
PRICE          -0.42039     0.03866        -10.8748
CAR            -0.60684     0.02467        -24.5964

ONE-WAY ERROR COMPONENT MODEL WITH GASOLINE DATA:
BETA, VARIANCES OF BETA, AND THETA

ESTIMATORS            BETA1     BETA2     BETA3  STD.BETA1 STD.BETA2 STD.BETA3    THETA
OLS                 0.88996  -0.89180  -0.76337    0.03581   0.03031   0.01861  0.00000
BETWEEN             0.96758  -0.96355  -0.79530    0.15567   0.13292   0.08247        .
WITHIN              0.66225  -0.32170  -0.64048    0.07339   0.04410   0.02968  1.00000
WALLACE & HUSSAIN   0.54346  -0.47111  -0.60613    0.06353   0.04550   0.02840  0.84802
AMEMIYA             0.60093  -0.36639  -0.62039    0.06542   0.04138   0.02718  0.93773
SWAMY & ARORA       0.55499  -0.42039  -0.60684    0.05717   0.03866   0.02467  0.89231
NERLOVE             0.66198  -0.32188  -0.64039    0.07107   0.04271   0.02874  0.99654
IMLE                0.58044  -0.38582  -0.61401    0.06286   0.04072   0.02647  0.92126

NEGATIVE VAR_MHU                 NEGA_VAR
OLS ESTIMATOR                        .
BETWEEN ESTIMATOR                    .
WITHIN ESTIMATOR                     .
WALLACE & HUSSAIN ESTIMATOR          0
AMEMIYA ESTIMATOR                    0
SWAMY & ARORA ESTIMATOR              0
NERLOVE ESTIMATOR                    .
IMLE                                 .

SAS PROGRAM

Options linesize=162;
Data Gasoline;

Infile 'b:/gasoline.dat' firstobs=2;

Input @1 Country $ @10 Year @15 Lgaspcar @29 Lincomep @44 Lprpmg @61 Lcarpcap;

Proc IML;

Use Gasoline; Read all into Temp;

N=18;T=19; NT=N*T;

One=Repeat(1,NT,1);

X=Temp[,3:5]; Y=Temp[,2]; Z=One||X; K=NCOL(X); LT=J(T,1,1); JT=LT*LT'; Z_U=I(N)@LT; P=Z_U*INV(Z_U'*Z_U)*Z_U'; Q=I(NT)-P;

JNT=Repeat(JT,N,N); JNT_BAR=JNT/NT;

* ........ OLS ESTIMATORS ........ *;

OLS_BETA=INV(Z'*Z)*Z'*Y;

OLS_RES=Y-Z*OLS_BETA;

VAR_REG=SSQ(OLS_RES)/(NT-NCOL(Z));

VAR_COV=VAR_REG*INV(Z'*Z);

STD_OLS=SQRT(VECDIAG(VAR_COV));

T_OLS=OLS_BETA/STD_OLS;

LOOK1=OLS_BETA||STD_OLS||T_OLS;

CTITLE={'PARAMETER' 'STANDARD ERROR' 'T-STATISTICS'}; RTITLE={'INTERCEPT' 'INCOME' 'PRICE' 'CAR'};

PRINT 'RESULTS OF OLS',,

LOOK1(|COLNAME=CTITLE ROWNAME=RTITLE FORMAT=8.5|);

* ........ BETWEEN ESTIMATOR ........ *;

BW_BETA=INV(Z'*P*Z)*Z'*P*Y;

BW_RES=P*Y-P*Z*BW_BETA;

VAR_BW=SSQ(BW_RES)/(N-NCOL(Z));

V_C_BW=VAR_BW*INV(Z'*P*Z);

STD_BW=SQRT(VECDIAG(V_C_BW));

T_BW=BW_BETA/STD_BW;

LOOK1=BW_BETA||STD_BW||T_BW;

CTITLE={'PARAMETER' 'STANDARD ERROR' 'T-STATISTICS'}; RTITLE={'INTERCEPT' 'INCOME' 'PRICE' 'CAR'};

PRINT 'RESULTS OF BETWEEN',,

LOOK1(|COLNAME=CTITLE ROWNAME=RTITLE FORMAT=8.5|);

* ...... WITHIN ESTIMATORS.................... *;

WT_BETA=INV(X'*Q*X)*X'*Q*Y;

WT_RES=Q*Y-Q*X*WT_BETA;

VAR_WT=SSQ(WT_RES)/(NT-N-NCOL(X));

V_C_WT=VAR_WT*INV(X'*Q*X);

STD_WT=SQRT(VECDIAG(V_C_WT));

T_WT=WT_BETA/STD_WT;

LOOK1=WT_BETA||STD_WT||T_WT;

CTITLE={'PARAMETER' 'STANDARD ERROR' 'T-STATISTICS'}; RTITLE={'INCOME' 'PRICE' 'CAR'};

PRINT 'RESULTS OF WITHIN',,

LOOK1(|COLNAME=CTITLE ROWNAME=RTITLE FORMAT=8.5|);

* -- WALLACE & HUSSAIN ESTIMATOR OF VARIANCE COMPONENTS --- *;

WH_V_V=(OLS_RES'*Q*OLS_RES)/(NT-N);

WH_V_1=(OLS_RES'*P*OLS_RES)/N;

******* Checking for negative VAR_MHU *******;

WH_V_MHU=(WH_V_1-WH_V_V)/T;

IF WH_V_MHU<0 THEN NEGA_WH=1; ELSE NEGA_WH=0; WH_V_MHU=WH_V_MHU#(WH_V_MHU>0); WH_V_1=(T*WH_V_MHU)+WH_V_V;

OMEGA_WH=(Q/WH_V_V)+(P/WH_V_1);

WH_BETA=INV(Z'*OMEGA_WH*Z)*Z'*OMEGA_WH*Y;

THETA_WH=1-(SQRT(WH_V_V)/SQRT(WH_V_1));

OMEGAWH=(Q/SQRT(WH_V_V))+(P/SQRT(WH_V_1));

WH_RES=(OMEGAWH*Y)-(OMEGAWH*Z*WH_BETA);

VAR_WH=SSQ(WH_RES)/(NT-NCOL(Z));

V_C_WH=INV(Z'*OMEGA_WH*Z);

STD_WH=SQRT(VECDIAG(V_C_WH));

T_WH=WH_BETA/STD_WH;

LOOK1=WH_BETA|| STD_WH ||T_WH;

CTITLE={'PARAMETER' 'STANDARD ERROR' 'T-STATISTICS'}; RTITLE={'INTERCEPT' 'INCOME' 'PRICE' 'CAR'};

PRINT 'RESULTS OF WALLACE-HUSSAIN',, LOOK1(|COLNAME=CTITLE ROWNAME=RTITLE FORMAT=8.5|); FREE OMEGA_WH OMEGAWH WH_RES;

* -- AMEMIYA ESTIMATOR OF VARIANCE COMPONENTS --- *;

Y_BAR=Y[:]; X_BAR=X[:,]; ALPHA_WT=Y_BAR-X_BAR*WT_BETA; LSDV_RES=Y-ALPHA_WT*ONE-X*WT_BETA; AM_V_V=(LSDV_RES'*Q*LSDV_RES)/(NT-N);

AM_V_1=(LSDV_RES'*P*LSDV_RES)/N;

***** Checking for negative VAR_MHU *********;

AM_V_MHU=(AM_V_1-AM_V_V)/T;

IF AM_V_MHU<0 THEN NEGA_AM=1; ELSE NEGA_AM=0; AM_V_MHU=AM_V_MHU # (AM_V_MHU>0); AM_V_1=(T*AM_V_MHU)+AM_V_V;

OMEGA_AM=(Q/AM_V_V)+(P/AM_V_1); AM_BETA=INV(Z'*OMEGA_AM*Z)*Z'*OMEGA_AM*Y;

THETA_AM=1-(SQRT(AM_V_V)/SQRT(AM_V_1));

OMEGAAM=(Q/SQRT(AM_V_V))+(P/SQRT(AM_V_1));

AM_RES=(OMEGAAM*Y)-(OMEGAAM*Z*AM_BETA);

VAR_AM=SSQ(AM_RES)/(NT-NCOL(Z));

V_C_AM=INV(Z'*OMEGA_AM*Z);

STD_AM=SQRT(VECDIAG(V_C_AM));

T_AM=AM_BETA/STD_AM;

LOOK1=AM_BETA||STD_AM||T_AM;

CTITLE={'PARAMETER' 'STANDARD ERROR' 'T-STATISTICS'}; RTITLE={'INTERCEPT' 'INCOME' 'PRICE' 'CAR'};

PRINT 'RESULTS OF AMEMIYA',,

LOOK1(|COLNAME=CTITLE ROWNAME=RTITLE FORMAT=8.5|); FREE OMEGA_AM OMEGAAM AM_RES;

* --- SWAMY & ARORA ESTIMATOR OF VARIANCE COMPONENTS ---- *;

SA_V_V=(Y'*Q*Y-Y'*Q*X*INV(X'*Q*X)*X'*Q*Y)/(NT-N-K);

SA_V_1=(Y'*P*Y-Y'*P*Z*INV(Z'*P*Z)*Z'*P*Y)/(N-K-1);

****** Checking for negative VAR_MHU ********;

SA_V_MHU=(SA_V_1-SA_V_V)/T;

IF SA_V_MHU<0 THEN NEGA_SA=1; ELSE NEGA_SA=0; SA_V_MHU=SA_V_MHU#(SA_V_MHU>0); SA_V_1=(T*SA_V_MHU)+SA_V_V;

OMEGA_SA=(Q/SA_V_V)+(P/SA_V_1); SA_BETA=INV(Z'*OMEGA_SA*Z)*Z'*OMEGA_SA*Y; THETA_SA=1-(SQRT(SA_V_V)/SQRT(SA_V_1)); OMEGASA=(Q/SQRT(SA_V_V))+(P/SQRT(SA_V_1)); SA_RES=(OMEGASA*Y)-(OMEGASA*Z*SA_BETA); VAR_SA=SSQ(SA_RES)/(NT-NCOL(Z));

V_C_SA=INV(Z'*OMEGA_SA*Z);

STD_SA=SQRT(VECDIAG(V_C_SA));

T_SA=SA_BETA/STD_SA;

LOOK1=SA_BETA||STD_SA||T_SA;

CTITLE={'PARAMETER' 'STANDARD ERROR' 'T-STATISTICS'}; RTITLE={'INTERCEPT' 'INCOME' 'PRICE' 'CAR'};

PRINT 'RESULTS OF SWAMY-ARORA',,

LOOK1(|COLNAME=CTITLE ROWNAME=RTITLE FORMAT=8.5|); FREE OMEGA_SA OMEGASA SA_RES;

* --- NERLOVE ESTIMATOR OF VARIANCE COMPONENTS AND BETA --- *;

MHU=P*LSDV_RES;

MEAN_MHU=MHU[:];

DEV_MHU=MHU-(ONE*MEAN_MHU);

VAR_MHU=SSQ(DEV_MHU)/(T*(N-1));

NL_V_V=SSQ(WT_RES)/NT;

NL_V_1=T*VAR_MHU+NL_V_V;

OMEGA_NL=(Q/NL_V_V)+(P/NL_V_1);

NL_BETA=INV(Z'*OMEGA_NL*Z)*Z'*OMEGA_NL*Y;

THETA_NL=1-(SQRT(NL_V_V)/SQRT(NL_V_1));

OMEGANL=(Q/SQRT(NL_V_V))+(P/SQRT(NL_V_1));

NL_RES=(OMEGANL*Y)-(OMEGANL*Z*NL_BETA);

VAR_NL=SSQ(NL_RES)/(NT-NCOL(Z));

V_C_NL=INV(Z'*OMEGA_NL*Z);

STD_NL=SQRT(VECDIAG(V_C_NL));

T_NL=NL_BETA/STD_NL;

LOOK1=NL_BETA||STD_NL||T_NL;

CTITLE={'PARAMETER' 'STANDARD ERROR' 'T-STATISTICS'}; RTITLE={'INTERCEPT' 'INCOME' 'PRICE' 'CAR'};

PRINT 'RESULTS OF NERLOVE',,

LOOK1(|COLNAME=CTITLE ROWNAME=RTITLE FORMAT=8.5|); FREE OMEGA_NL OMEGANL NL_RES;

*--- MAXIMUM LIKELIHOOD ESTIMATION ----*;

/* START WITH WITHIN AND BETWEEN BETA SUGGESTED BY BREUSCH(1987) */;

CRITICAL=1; BETA_W=WT_BETA; BETA_B=BW_BETA[2:K+1,]; BETA_MLE=WT_BETA;

DO WHILE (CRITICAL>0.0001); WT_RES=Y - X*BETA_W; BW_RES=Y - X*BETA_B;

PHISQ_W=(WT_RES'*Q*WT_RES)/((T-1)*(WT_RES'*(P-JNT_BAR)*WT_RES));

PHISQ_B=(BW_RES'*Q*BW_RES)/((T-1)*(BW_RES'*(P-JNT_BAR)*BW_RES));

CRITICAL=PHISQ_W-PHISQ_B;

BETA_W=INV(X'*(Q+PHISQ_W*(P-JNT_BAR))*X)*X'*(Q+PHISQ_W*(P-JNT_BAR))*Y;

BETA_B=INV(X'*(Q+PHISQ_B*(P-JNT_BAR))*X)*X'*(Q+PHISQ_B*(P-JNT_BAR))*Y;

BETA_MLE=(BETA_W+BETA_B)/2;

END;

D_MLE=Y-X*BETA_MLE;

PHISQ_ML=(D_MLE'*Q*D_MLE)/((T-1)*D_MLE'*(P-JNT_BAR)*D_MLE); THETA_ML=1-SQRT(PHISQ_ML);

VAR_V_ML=D_MLE'*(Q+PHISQ_ML*(P-JNT_BAR))*D_MLE/NT;

VAR_1_ML=VAR_V_ML/PHISQ_ML; OMEGA_ML=(Q/VAR_V_ML)+(P/VAR_1_ML);

ML_BETA=INV(Z'*OMEGA_ML*Z)*Z'*OMEGA_ML*Y; OMEGAML=(Q/SQRT(VAR_V_ML))+(P/SQRT(VAR_1_ML));

ML_RES=(OMEGAML*Y)-(OMEGAML*Z*ML_BETA); VAR_ML=SSQ(ML_RES)/(NT-NCOL(Z)); V_C_ML=INV(Z'*OMEGA_ML*Z);

STD_ML=SQRT(VECDIAG(V_C_ML)); T_ML=ML_BETA/STD_ML;

LOOK1=ML_BETA||STD_ML||T_ML;

CTITLE={'PARAMETER' 'STANDARD ERROR' 'T-STATISTICS'}; RTITLE={'INTERCEPT' 'INCOME' 'PRICE' 'CAR'};

PRINT 'RESULTS OF MAXIMUM LIKELIHOOD',,

LOOK1(|COLNAME=CTITLE ROWNAME=RTITLE FORMAT=8.5|);

FREE OMEGA_ML;

*.......... PRINT AND OUTPUT INFORMATION.......................... *;

BETA=OLS_BETA'[,2:K+1]//BW_BETA'[,2:K+1]//WT_BETA'//WH_BETA'[,2:K+1]//AM_BETA'[,2:K+1]//SA_BETA'[,2:K+1]//NL_BETA'[,2:K+1]//ML_BETA'[,2:K+1];
STD_ERR=STD_OLS'[,2:K+1]//STD_BW'[,2:K+1]//STD_WT'//STD_WH'[,2:K+1]//STD_AM'[,2:K+1]//STD_SA'[,2:K+1]//STD_NL'[,2:K+1]//STD_ML'[,2:K+1];
THETAS={0,.,1}//THETA_WH//THETA_AM//THETA_SA//THETA_NL//THETA_ML;

NEGA_VAR={.,.,.}//NEGA_WH//NEGA_AM//NEGA_SA//{.,.};
OUTPUT=BETA||STD_ERR||THETAS||NEGA_VAR;

C2={"BETA1" "BETA2" "BETA3" "STD_BETA1" "STD_BETA2" "STD_BETA3" "THETA"};

R={"OLS ESTIMATOR" "BETWEEN ESTIMATOR" "WITHIN ESTIMATOR" "WALLACE & HUSSAIN ESTIMATOR"

"AMEMIYA ESTIMATOR" "SWAMY & ARORA ESTIMATOR" "NERLOVE ESTIMATOR" "IMLE"};

PRINT 'ONE-WAY ERROR COMPONENT MODEL WITH GASOLINE DATA: BETA, VARIANCES OF BETA, AND THETA'

,,OUTPUT(|ROWNAME=R COLNAME=C2 FORMAT=8.5|);

PRINT 'NEGATIVE VAR_MHU',,NEGA_VAR(|ROWNAME=R|);
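The IML code above translates almost line for line into other matrix languages. Since the gasoline data themselves are on the diskette, the sketch below instead runs the Swamy-Arora branch in Python/numpy on synthetic one-way error component data; the true coefficients and variance components are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
N, T, K = 18, 19, 3                            # same panel dimensions as the gasoline data
NT = N * T
P = np.kron(np.eye(N), np.ones((T, T)) / T)    # between projection
Q = np.eye(NT) - P                             # within projection

beta_true = np.array([1.0, 0.5, -0.9, -0.7])   # intercept + 3 slopes (made up)
X = rng.normal(size=(NT, K))
Z = np.hstack([np.ones((NT, 1)), X])
mu = np.repeat(rng.normal(scale=np.sqrt(2.0), size=N), T)   # individual effects
y = Z @ beta_true + mu + rng.normal(scale=1.0, size=NT)

# Swamy-Arora variance components (the SA_V_V / SA_V_1 lines of the IML code)
s2_nu = (y @ Q @ y - y @ Q @ X @ np.linalg.solve(X.T @ Q @ X, X.T @ Q @ y)) / (NT - N - K)
s2_1 = (y @ P @ y - y @ P @ Z @ np.linalg.solve(Z.T @ P @ Z, Z.T @ P @ y)) / (N - K - 1)
s2_mu = max((s2_1 - s2_nu) / T, 0.0)           # truncate a negative estimate at zero
s2_1 = T * s2_mu + s2_nu

Omega_inv = Q / s2_nu + P / s2_1
beta_gls = np.linalg.solve(Z.T @ Omega_inv @ Z, Z.T @ Omega_inv @ y)
theta = 1 - np.sqrt(s2_nu / s2_1)
```

With this sample size the GLS slopes land close to the chosen truth and the estimated theta falls strictly between the OLS (0) and Within (1) extremes, mirroring the pattern in the summary table above.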

12.11 Bounds on $s^2$ in the Random Effects Model.

a. This solution is based on Baltagi and Kramer (1994). From (12.3), one gets $\hat{\delta}_{OLS} = (Z'Z)^{-1}Z'y$ and $\hat{u}_{OLS} = y - Z\hat{\delta}_{OLS} = \bar{P}_Z u$ where $\bar{P}_Z = I_{NT} - P_Z$ with $P_Z = Z(Z'Z)^{-1}Z'$. Also,

$E(s^2) = E[\hat{u}'\hat{u}/(NT - K')] = E[u'\bar{P}_Z u/(NT - K')] = \mathrm{tr}(\Omega\bar{P}_Z)/(NT - K')$

which from (12.17) reduces to

$E(s^2) = \sigma_\nu^2 + \sigma_\mu^2\left(NT - \mathrm{tr}[(I_N \otimes J_T)P_Z]\right)/(NT - K')$

since $\mathrm{tr}(I_{NT}) = \mathrm{tr}(I_N \otimes J_T) = NT$ and $\mathrm{tr}(P_Z) = K'$. By adding and subtracting $\sigma_\mu^2$, one gets

$E(s^2) = \sigma^2 + \sigma_\mu^2\left[K' - \mathrm{tr}[(I_N \otimes J_T)P_Z]\right]/(NT - K')$

where $\sigma^2 = E(u_{it}^2) = \sigma_\mu^2 + \sigma_\nu^2$ for all $i$ and $t$.

b. Nerlove (1971) derived the characteristic roots and vectors of $\Omega$ given in (12.17). These characteristic roots turn out to be $\sigma_\nu^2$ with multiplicity $N(T - 1)$, and $(T\sigma_\mu^2 + \sigma_\nu^2)$ with multiplicity $N$. Therefore, with $n = NT$, the smallest $(n - K')$ characteristic roots are made up of the $(n - N)$ $\sigma_\nu^2$'s and $(N - K')$ of the $(T\sigma_\mu^2 + \sigma_\nu^2)$'s. This implies that the mean of the $(n - K')$ smallest characteristic roots of $\Omega$ is $\left[(n - N)\sigma_\nu^2 + (N - K')(T\sigma_\mu^2 + \sigma_\nu^2)\right]/(n - K')$. Similarly, the largest $(n - K')$ characteristic roots are made up of the $N$ $(T\sigma_\mu^2 + \sigma_\nu^2)$'s and $(n - N - K')$ of the $\sigma_\nu^2$'s. This implies that the mean of the $(n - K')$ largest characteristic roots of $\Omega$ is $\left[N(T\sigma_\mu^2 + \sigma_\nu^2) + (n - N - K')\sigma_\nu^2\right]/(n - K')$. Using the Kiviet and Kramer (1992) inequalities, one gets

$0 \leq \sigma_\nu^2 + (n - TK')\sigma_\mu^2/(n - K') \leq E(s^2) \leq \sigma_\nu^2 + n\sigma_\mu^2/(n - K') \leq n\sigma^2/(n - K')$

As $n \to \infty$, both bounds tend to $\sigma^2$, and $s^2$ is asymptotically unbiased, irrespective of the particular evolution of $X$.
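Nerlove's spectral decomposition of $\Omega$ is easy to confirm numerically with illustrative variance components:

```python
import numpy as np

N, T, s2_mu, s2_nu = 4, 3, 1.5, 0.5            # illustrative values
Omega = s2_mu * np.kron(np.eye(N), np.ones((T, T))) + s2_nu * np.eye(N * T)
roots = np.sort(np.linalg.eigvalsh(Omega))     # ascending eigenvalues

assert np.allclose(roots[:N * (T - 1)], s2_nu)              # multiplicity N(T-1)
assert np.allclose(roots[N * (T - 1):], T * s2_mu + s2_nu)  # multiplicity N
```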

12.12 $M = I_{NT} - Z(Z'Z)^{-1}Z'$ and $M^* = I_{NT} - Z^*(Z^{*\prime}Z^*)^{-1}Z^{*\prime}$ are both symmetric and idempotent. From (12.43), it is clear that $Z = Z^*I^*$ with $I^* = (\iota_N \otimes I_{K'})$, $\iota_N$ being a vector of ones of dimension $N$ and $K' = K + 1$. Hence,

$MM^* = I_{NT} - Z(Z'Z)^{-1}Z' - Z^*(Z^{*\prime}Z^*)^{-1}Z^{*\prime} + Z(Z'Z)^{-1}Z'Z^*(Z^{*\prime}Z^*)^{-1}Z^{*\prime}$

Substituting $Z' = I^{*\prime}Z^{*\prime}$ in the last term, it reduces to

$Z(Z'Z)^{-1}I^{*\prime}Z^{*\prime}Z^*(Z^{*\prime}Z^*)^{-1}Z^{*\prime} = Z(Z'Z)^{-1}I^{*\prime}Z^{*\prime} = Z(Z'Z)^{-1}Z'$

Hence, $MM^* = I_{NT} - Z^*(Z^{*\prime}Z^*)^{-1}Z^{*\prime} = M^*$.
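The identity $MM^* = M^*$ can be checked numerically by building a block-diagonal unrestricted regressor matrix $Z^*$ (with hypothetical random regressors) and forming $Z = Z^*I^*$:

```python
import numpy as np

rng = np.random.default_rng(5)
N, T, Kp = 3, 5, 2                             # Kp = K' = K + 1 columns per individual
NT = N * T
Zstar = np.zeros((NT, N * Kp))                 # block-diagonal unrestricted regressors
for i in range(N):
    block = np.hstack([np.ones((T, 1)), rng.normal(size=(T, Kp - 1))])
    Zstar[i * T:(i + 1) * T, i * Kp:(i + 1) * Kp] = block
Istar = np.kron(np.ones((N, 1)), np.eye(Kp))   # I* = iota_N kron I_K'
Z = Zstar @ Istar                              # restricted (pooled) regressors

M = np.eye(NT) - Z @ np.linalg.solve(Z.T @ Z, Z.T)
Mstar = np.eye(NT) - Zstar @ np.linalg.solve(Zstar.T @ Zstar, Zstar.T)
assert np.allclose(M @ Mstar, Mstar)           # MM* = M*
```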

12.13 This problem differs from problem 12.12 in that $\tilde{Z} = \Omega^{-1/2}Z$ and $\tilde{Z}^* = \Omega^{-1/2}Z^*$. Since $Z = Z^*I^*$, premultiplying both sides by $\Omega^{-1/2}$ one gets $\tilde{Z} = \tilde{Z}^*I^*$. Define

$\tilde{M} = I_{NT} - \tilde{Z}(\tilde{Z}'\tilde{Z})^{-1}\tilde{Z}'$ and $\tilde{M}^* = I_{NT} - \tilde{Z}^*(\tilde{Z}^{*\prime}\tilde{Z}^*)^{-1}\tilde{Z}^{*\prime}$

Both are projection matrices that are symmetric and idempotent. The proof of $\tilde{M}\tilde{M}^* = \tilde{M}^*$ is the same as that of $MM^* = M^*$ given in problem 12.12, with $\tilde{Z}$ replacing $Z$ and $\tilde{Z}^*$ replacing $Z^*$.

12.16 a. $\hat{\beta}_{GLS} = (X'\Omega^{-1}X)^{-1}X'\Omega^{-1}y = \beta + (X'\Omega^{-1}X)^{-1}X'\Omega^{-1}u$ with $E(\hat{\beta}_{GLS}) = \beta$, and $\hat{\beta}_{Within} = (X'QX)^{-1}X'Qy = \beta + (X'QX)^{-1}X'Qu$ with $E(\hat{\beta}_{Within}) = \beta$. Therefore, $\hat{q} = \hat{\beta}_{GLS} - \hat{\beta}_{Within}$ has $E(\hat{q}) = 0$. Also,

$\mathrm{cov}(\hat{\beta}_{GLS}, \hat{q}) = E(\hat{\beta}_{GLS} - \beta)\hat{q}' = E(\hat{\beta}_{GLS} - \beta)\left[(\hat{\beta}_{GLS} - \beta)' - (\hat{\beta}_{Within} - \beta)'\right] = \mathrm{var}(\hat{\beta}_{GLS}) - \mathrm{cov}(\hat{\beta}_{GLS}, \hat{\beta}_{Within})$

$= E[(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}uu'\Omega^{-1}X(X'\Omega^{-1}X)^{-1}] - E[(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}uu'QX(X'QX)^{-1}]$

$= (X'\Omega^{-1}X)^{-1} - (X'\Omega^{-1}X)^{-1}X'\Omega^{-1}\Omega QX(X'QX)^{-1} = (X'\Omega^{-1}X)^{-1} - (X'\Omega^{-1}X)^{-1} = 0.$

b. Using the fact that $\hat{\beta}_{Within} = -\hat{q} + \hat{\beta}_{GLS}$, one gets $\mathrm{var}(\hat{\beta}_{Within}) = \mathrm{var}(-\hat{q} + \hat{\beta}_{GLS}) = \mathrm{var}(\hat{q}) + \mathrm{var}(\hat{\beta}_{GLS})$, since $\mathrm{cov}(\hat{\beta}_{GLS}, \hat{q}) = 0$. Therefore,

$\mathrm{var}(\hat{q}) = \mathrm{var}(\hat{\beta}_{Within}) - \mathrm{var}(\hat{\beta}_{GLS}) = \sigma_\nu^2(X'QX)^{-1} - (X'\Omega^{-1}X)^{-1}.$
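Both steps of this Hausman-type argument can be illustrated numerically: the covariance between $\hat{\beta}_{GLS}$ and $\hat{\beta}_{Within}$ collapses to $\mathrm{var}(\hat{\beta}_{GLS})$, and the implied $\mathrm{var}(\hat{q})$ is positive semi-definite. A sketch with hypothetical regressors and illustrative variance components:

```python
import numpy as np

rng = np.random.default_rng(6)
N, T, K = 5, 4, 2
s2_mu, s2_nu = 2.0, 0.5                        # illustrative variance components
s2_1 = T * s2_mu + s2_nu
NT = N * T
P = np.kron(np.eye(N), np.ones((T, T)) / T)
Q = np.eye(NT) - P
Omega = s2_1 * P + s2_nu * Q
Omega_inv = P / s2_1 + Q / s2_nu
X = rng.normal(size=(NT, K))                   # hypothetical regressors

var_gls = np.linalg.inv(X.T @ Omega_inv @ X)
var_within = s2_nu * np.linalg.inv(X.T @ Q @ X)

# cov(beta_GLS, beta_Within) = (X'O^-1 X)^-1 X'O^-1 Omega Q X (X'QX)^-1 = var(beta_GLS)
cov_gls_within = var_gls @ (X.T @ Omega_inv @ Omega @ Q @ X) @ np.linalg.inv(X.T @ Q @ X)
assert np.allclose(cov_gls_within, var_gls)    # hence cov(beta_GLS, q) = 0

var_q = var_within - var_gls                   # part (b)
assert np.all(np.linalg.eigvalsh(var_q) >= -1e-10)   # var(q) is p.s.d.
```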
Detailed solutions for problems 12.17, 12.18 and 12.21 are given in Baltagi (2009).

References

Baltagi, B. H. (2009), A Companion to Econometric Analysis of Panel Data (Wiley: Chichester).

Baltagi, B. H. and W. Kramer (1994), “Consistency, Asymptotic Unbiasedness and Bounds on the Bias of s2 in the Linear Regression Model with Error Components Disturbances,” Statistical Papers, 35: 323-328.

Davidson, R. and J. G. MacKinnon (1993), Estimation and Inference in Econometrics (Oxford University Press: New York).

Fuller, W. A. and G. E. Battese (1974), “Estimation of Linear Models with Crossed-Error Structure,” Journal of Econometrics, 2: 67-78.

Nerlove, M. (1971), “A Note on Error Components Models,” Econometrica, 39: 383-396.

Swamy, P. A.V. B. and S. S. Arora (1972), “The Exact Finite Sample Properties of the Estimators of Coefficients in the Error Components Regression Models,” Econometrica, 40: 261-275.
