A COMPANION TO Theoretical Econometrics

The Box-Jenkins class of models

The simplest time series model of serial correlation is the first-order autoregressive (AR(1)) process, which for a time series y_t, t = 1, ..., n, with mean zero can be written as

y_t = ρy_{t-1} + ε_t,  |ρ| < 1,  ε_t ~ iid(0, σ²). (3.1)

ε_t is the error or innovation of the process at time t and is assumed to be independently distributed with mean zero and constant variance σ². The requirement on the scalar parameter ρ that |ρ| < 1 is called the stationarity condition; it stops y_t from being an explosive process and ensures that its variance is a constant σ²/(1 - ρ²). Model (3.1) is incomplete without a statement or assumption about how y_0 is generated. A standard assumption is that the process has been going on forever and y_0 has the same distribution as any other y_t (the stationarity assumption). This is often expressed by y_0 being distributed with mean zero and variance σ²/(1 - ρ²). An alternative approach is to treat y_0 as a constant, often zero. This results in a nonstationary process because its variance is not constant.
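These properties are easy to check by simulation. The following is a minimal sketch, with illustrative values ρ = 0.7 and σ = 1 (not from the text): a stationary AR(1) path is generated with y_0 drawn from its stationary distribution, and the sample variance is compared with σ²/(1 - ρ²).

```python
import numpy as np

rng = np.random.default_rng(0)
rho, sigma, n = 0.7, 1.0, 200_000   # illustrative values

y = np.empty(n)
# Stationarity assumption: y_0 drawn with variance sigma^2 / (1 - rho^2)
y[0] = rng.normal(0.0, sigma / np.sqrt(1 - rho**2))
eps = rng.normal(0.0, sigma, n)
for t in range(1, n):
    y[t] = rho * y[t - 1] + eps[t]

print(sigma**2 / (1 - rho**2))   # theoretical variance, about 1.961
print(float(y.var()))            # sample variance, close to the above
```

Starting instead from a fixed y_0 = 0 would make the early variances smaller than σ²/(1 - ρ²), which is the nonstationarity mentioned above.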

If we denote by y = (y_1, ..., y_n)' the vector of observations on the process, then y has as its mean the n x 1 vector of zeros and, in the stationary case, the covariance matrix

var(y) = σ²/(1 - ρ²) ×

    | 1         ρ         ρ²        ...  ρ^(n-1) |
    | ρ         1         ρ         ...  ρ^(n-2) |
    | ρ²        ρ         1         ...  ρ^(n-3) |
    | ...       ...       ...       ...  ...     |
    | ρ^(n-1)   ρ^(n-2)   ρ^(n-3)   ...  1       |

The latter follows from

cov(y_t, y_{t-i}) = ρ^i var(y_t) = ρ^i σ²/(1 - ρ²),  i = 1, ..., n - 1.

This implies that the AR(1) process has an autocorrelation function of the form

ρ(i) = cov(y_t, y_{t-i})/(var(y_t) var(y_{t-i}))^(1/2) = ρ^i

which declines at an exponential rate as i increases.
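This exponential decay can also be verified numerically. In the sketch below (with an illustrative ρ = 0.8), the sample autocorrelations of a simulated stationary AR(1) path at the first few lags are compared with ρ^i.

```python
import numpy as np

rng = np.random.default_rng(1)
rho, n = 0.8, 500_000   # illustrative value and series length

# Simulate a stationary AR(1) path
y = np.empty(n)
y[0] = rng.normal(0.0, 1.0 / np.sqrt(1 - rho**2))
eps = rng.normal(size=n)
for t in range(1, n):
    y[t] = rho * y[t - 1] + eps[t]

yc = y - y.mean()
var = yc @ yc / n
# Sample autocorrelation at lags 1..4 vs the theoretical value rho^i
acf = {i: float((yc[i:] @ yc[:-i]) / n / var) for i in range(1, 5)}
for i, a in acf.items():
    print(i, round(a, 3), round(rho**i, 3))
```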

Another simple model is the first-order moving average (MA(1)) process which can be written as

y_t = ε_t + γε_{t-1},  ε_t ~ iid(0, σ²) (3.2)

where ε_t is defined as in (3.1). The n x 1 vector y has mean zero and covariance matrix

var(y) = σ² ×

    | 1+γ²   γ      0      ...   0      0    |
    | γ      1+γ²   γ      ...   0      0    |
    | 0      γ      1+γ²   ...   0      0    |
    | ...    ...    ...    ...   ...    ...  |
    | 0      0      0      ...   1+γ²   γ    |
    | 0      0      0      ...   γ      1+γ² |    (3.3)

Note that var(y_t) = σ²(1 + γ²) and the autocorrelation function of an MA(1) process is

ρ(i) = γ/(1 + γ²),  i = 1,
     = 0,           i > 1.

We usually assume |γ| < 1, although it is possible to have an MA(1) process with |γ| > 1; but for normally distributed errors it is impossible to distinguish between the likelihood function for (3.2) with (γ*, σ*²) = (γ, σ²) and with (γ*, σ*²) = (1/γ, σ²γ²), because (3.3) takes the same value for these two sets of parameter values. The standard solution to this minor identification problem is to restrict γ to the interval |γ| < 1. This is known as the invertibility condition. An MA(1) process is stationary because y_t is a simple weighted sum of the innovations ε_t, ε_{t-1}, so no condition is required for stationarity.
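The identification problem can be seen in a short calculation: under the two parameterizations the MA(1) variance and lag-one autocovariance, and hence the whole covariance matrix (3.3), coincide. The value γ = 0.5 below is arbitrary.

```python
# Illustrative values: gamma = 0.5, sigma^2 = 1
gamma, sigma2 = 0.5, 1.0
gamma_star, sigma2_star = 1.0 / gamma, sigma2 * gamma**2

def ma1_moments(g, s2):
    """Variance and lag-1 autocovariance of an MA(1) process."""
    return s2 * (1.0 + g**2), s2 * g

print(ma1_moments(gamma, sigma2))            # (1.25, 0.5)
print(ma1_moments(gamma_star, sigma2_star))  # (1.25, 0.5)
```

Since all autocovariances beyond lag one are zero for an MA(1), matching these two moments is enough to make the Gaussian likelihoods identical.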

If we combine (3.1) and (3.2), we get an autoregressive moving average (ARMA(1, 1)) process which can be written as

y_t = ρy_{t-1} + ε_t + γε_{t-1},  |ρ| < 1,  ε_t ~ iid(0, σ²). (3.4)

Note that

var(y_t) = σ²{(1 + γ²) + 2γρ}/(1 - ρ²)

cov(y_t, y_{t-i}) = ρ var(y_t) + γσ²,            for i = 1,
                  = ρ^(i-1) cov(y_t, y_{t-1}),   for i ≥ 2.

It is also worth observing that if

y_t = ε_t, (3.5)

i.e., we have a white noise model for y_t with no serial correlation, then by lagging (3.5) one period, multiplying it by ρ and subtracting from (3.5) we get

y_t = ρy_{t-1} + ε_t - ρε_{t-1}

which is (3.4) with γ = -ρ. This is known as a model with a common factor. It appears to be an ARMA(1, 1) model but it is in fact a white noise model.
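Plugging γ = -ρ into the ARMA(1, 1) moment formulas above confirms the common-factor result numerically (ρ = 0.6 is an illustrative value): the variance collapses to σ² and the lag-one autocovariance to zero, exactly the white noise moments.

```python
rho, sigma2 = 0.6, 1.0   # illustrative values
gamma = -rho             # the common-factor case

# ARMA(1,1) moments: var and lag-1 autocovariance
var_y = sigma2 * ((1 + gamma**2) + 2 * gamma * rho) / (1 - rho**2)
cov1 = rho * var_y + gamma * sigma2

print(var_y, cov1)   # white-noise values sigma^2 and 0, up to float rounding
```

Since every higher-lag autocovariance is ρ^(i-1) times cov1, all of them vanish too, so the process is indistinguishable from white noise.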

As in the AR(1) case, (3.4) is not complete until we have made an assumption about y_0. Again the usual assumption is stationarity, i.e. to assume that (3.4) is a process that has been going on forever.

The pth order autoregressive process (AR(p)) is a generalization of (3.1) and can be written as

y_t = ρ_1 y_{t-1} + ρ_2 y_{t-2} + ... + ρ_p y_{t-p} + ε_t,  ε_t ~ iid(0, σ²). (3.6)

The stationarity condition generalizes to the requirement that the roots of the polynomial equation

1 - ρ_1 z - ρ_2 z² - ... - ρ_p z^p = 0 (3.7)

lie outside the unit circle. In other words, all the roots of (3.7) must be larger than one in absolute value. For an AR(2) process, this requires

ρ_1 + ρ_2 < 1,  ρ_2 - ρ_1 < 1  and  ρ_2 > -1.
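The equivalence between these inequalities and the location of the roots of (3.7) can be checked directly for the AR(2) case; the coefficient pairs below are illustrative.

```python
import numpy as np

def stationary_ar2(rho1, rho2):
    """Check AR(2) stationarity via the inequalities and via the roots."""
    inequalities = (rho1 + rho2 < 1) and (rho2 - rho1 < 1) and (rho2 > -1)
    # Roots of 1 - rho1*z - rho2*z^2 = 0 (np.roots wants highest power first)
    roots = np.roots([-rho2, -rho1, 1.0])
    outside = bool(np.all(np.abs(roots) > 1.0))
    return inequalities, outside

print(stationary_ar2(0.5, 0.3))   # (True, True): stationary
print(stationary_ar2(0.9, 0.3))   # (False, False): rho1 + rho2 > 1
```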

In a similar way, the qth order moving average process (MA(q)) is a generalization of (3.2) and has the form

y_t = ε_t + γ_1 ε_{t-1} + ... + γ_q ε_{t-q},  ε_t ~ iid(0, σ²). (3.8)

The invertibility condition is now the requirement that the roots of the polynomial equation

1 + γ_1 z + γ_2 z² + ... + γ_q z^q = 0

lie outside the unit circle. Again no further condition is required for stationarity. Observe that

var(y_t) = σ²(1 + γ_1² + ... + γ_q²)

cov(y_t, y_{t-i}) = σ²(γ_i + γ_1 γ_{i+1} + ... + γ_{q-i} γ_q),  for i = 1, ..., q,
                  = 0,  for i > q.
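These moment formulas can be cross-checked against a long simulated MA(3) path; the γ values below are illustrative, and the sample autocovariances beyond lag q should be near zero.

```python
import numpy as np

rng = np.random.default_rng(2)
gammas = np.array([1.0, 0.4, 0.3, 0.2])   # (gamma_0 = 1, gamma_1, gamma_2, gamma_3)
sigma2, n, q = 1.0, 400_000, 3

# Simulate y_t = eps_t + gamma_1 eps_{t-1} + gamma_2 eps_{t-2} + gamma_3 eps_{t-3}
eps = rng.normal(0.0, np.sqrt(sigma2), n + q)
y = sum(g * eps[q - j : n + q - j] for j, g in enumerate(gammas))

results = []
for i in range(q + 2):
    theory = float(sigma2 * (gammas[i:] @ gammas[: len(gammas) - i])) if i <= q else 0.0
    sample = float(y[i:] @ y[: n - i]) / n   # sample autocovariance (mean is ~0)
    results.append((i, theory, sample))
    print(i, round(theory, 3), round(sample, 3))
```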

In addition, (3.6) and (3.8) can be combined to produce an ARMA(p, q) process which has the form

y_t = ρ_1 y_{t-1} + ρ_2 y_{t-2} + ... + ρ_p y_{t-p} + ε_t + γ_1 ε_{t-1} + ... + γ_q ε_{t-q},  ε_t ~ iid(0, σ²). (3.9)


This class of models is popular because it can often allow the serial correlation in a stationary time series to be modeled with a handful of parameters.

A further important generalization of the class of ARMA(p, q) models is the class of autoregressive integrated moving average (ARIMA) models. If L denotes the lag operator such that Ly_t = y_{t-1}, then the ARIMA(p, d, q) model is just an ARMA(p, q) model applied to the transformed variable

(1 - L)^d y_t (3.10)

rather than to y_t. When d = 1, (1 - L)^d y_t = y_t - y_{t-1}, which is the first difference of y_t. Often y_t is not a stationary series but its first (or higher) difference is. Hence it may be sensible to fit an ARMA model to the first (or higher) difference of y_t.
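For integer d, applying (3.10) is a repeated first difference; with NumPy, np.diff(y, n=d) computes (1 - L)^d y_t directly. The random-walk input below is illustrative: the level series is nonstationary, but its first difference is just the iid innovations.

```python
import numpy as np

rng = np.random.default_rng(3)
eps = rng.normal(size=10)
y = np.cumsum(eps)        # a random walk: y_t = y_{t-1} + eps_t

d1 = np.diff(y)           # (1 - L) y_t = y_t - y_{t-1}
d2 = np.diff(y, n=2)      # (1 - L)^2 y_t, the second difference

print(np.allclose(d1, eps[1:]))   # True: first differencing recovers eps
```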

A special ARIMA model is the ARIMA(0, 1, 0) model which is also known as the random walk model because it has the form

y_t = y_{t-1} + ε_t,  ε_t ~ iid(0, σ²). (3.11)

Observe that (3.11) can also be written as

y_t = y_0 + ε_1 + ... + ε_t

which means var(y_t - y_0) = tσ², a quantity that goes to infinity as t increases.
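A small Monte Carlo makes the linear growth of var(y_t - y_0) concrete; the horizon t = 50 and replication count below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
reps, t, sigma = 100_000, 50, 1.0   # illustrative choices

# y_t - y_0 is the sum of t iid innovations; simulate it reps times
steps = rng.normal(0.0, sigma, (reps, t))
y_t = steps.sum(axis=1)

print(t * sigma**2)        # theoretical variance: 50.0
print(float(y_t.var()))    # sample variance, close to 50
```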

Finally there is the related class of fractionally integrated models, denoted ARFIMA, in which the d in (3.10) is not restricted to taking only an integer value.

For further details on ARIMA models, see Box and Jenkins (1970, 1976). A good survey of fractionally integrated processes and their applications in econometrics is given by Baillie (1996).
