6.3 Moving Average Processes
Similarly, there exists an $N_2$ such that
\[
\sum_{j=-m}^{-(n+1)} \alpha_j^2 < \epsilon/2, \qquad n, m > N_2.
\]
Hence, given $\epsilon > 0$, there exists an $N$ such that
\[
\mathrm{E}\big[(X_{nt} - X_{mt})^2\big] < \epsilon, \qquad n, m > N.
\]
That is, for each $t = 0, \pm 1, \pm 2, \ldots$, the sequence $X_{nt}$, $n = 1, 2, \ldots$, is Cauchy in mean-square. The result now follows from Theorem 6.4.
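To see the mean-square Cauchy argument in a concrete case, the following sketch (an illustration added here, not part of the text) uses the assumed coefficient sequence $\alpha_j = 2^{-|j|}$ and evaluates $\sum_{n < |j| \le m} \alpha_j^2$, which by independence and unit variance of the noise terms equals $\mathrm{E}[(X_{nt} - X_{mt})^2]$ for $m > n$; the values decrease rapidly in $n$, exactly as square-summability of the coefficients requires.

# Illustrative sketch (not from the text): for square-summable coefficients,
# E[(X_nt - X_mt)^2] equals the sum of alpha_j^2 over n < |j| <= m (for m > n),
# so tail sums of the alpha_j^2 control the mean-square Cauchy property.
# The coefficient sequence alpha_j = 2**(-abs(j)) is an arbitrary choice.
import numpy as np

def alpha(j):
    return 2.0 ** (-np.abs(j))

def tail_sum(n, m):
    """Sum of alpha_j^2 over n < |j| <= m; equals E[(X_nt - X_mt)^2] when m > n."""
    js = np.arange(n + 1, m + 1)
    return 2.0 * np.sum(alpha(js) ** 2)   # factor 2 counts j and -j

for n in (5, 10, 20):
    print(n, tail_sum(n, 10 * n))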
The autocovariance function of a moving average process is given in the following
theorem.
Theorem 6.6. Let $\{X_t : t \in \mathbb{Z}\}$ be a moving average process of the form
\[
X_t = \sum_{j=-\infty}^{\infty} \alpha_j \epsilon_{t-j}, \qquad t = 0, \pm 1, \pm 2, \ldots,
\]
where $\ldots, \epsilon_{-1}, \epsilon_0, \epsilon_1, \ldots$ is a sequence of independent random variables such that $\mathrm{E}(\epsilon_j) = 0$ and $\mathrm{Var}(\epsilon_j) = 1$, $j = \ldots, -1, 0, 1, \ldots$, and
\[
\sum_{j=-\infty}^{\infty} \alpha_j^2 < \infty.
\]
Then $\{X_t : t \in \mathbb{Z}\}$ is a second-order stationary process with mean 0 and autocovariance function
\[
R(h) = \sum_{j=-\infty}^{\infty} \alpha_j \alpha_{j+h}, \qquad h = 0, \pm 1, \pm 2, \ldots.
\]
If $\ldots, \epsilon_{-1}, \epsilon_0, \epsilon_1, \ldots$ are identically distributed, then $\{X_t : t \in \mathbb{Z}\}$ is stationary.
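As a numerical check of the autocovariance formula (added here as an illustration, not taken from the text), the sketch below simulates a finite moving average whose only nonzero coefficients are $\alpha_{-1}, \alpha_0, \alpha_1$, with arbitrarily chosen values, and compares the sample autocovariance at small lags with $\sum_j \alpha_j \alpha_{j+h}$; the two agree up to simulation error.

# Illustrative sketch (not from the text): compare the sample autocovariance of a
# simulated finite moving average with R(h) = sum_j alpha_j * alpha_{j+h}.
# The coefficients, series length, and seed are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)

alpha = {-1: 0.5, 0: 1.0, 1: -0.3}   # alpha_j for j = -1, 0, 1; all others zero
T = 200_000                          # length of the simulated series
eps = rng.standard_normal(T + 2)     # i.i.d. noise with mean 0 and variance 1

# X_t = sum_j alpha_j * eps_{t-j}, with indices shifted so every term is available
X = sum(a * eps[1 - j : T + 1 - j] for j, a in alpha.items())

def theoretical_R(h):
    """R(h) = sum_j alpha_j * alpha_{j+h}; only finitely many terms are nonzero."""
    return sum(a * alpha.get(j + h, 0.0) for j, a in alpha.items())

def sample_R(x, h):
    """Sample autocovariance of the series x at lag h."""
    x = x - x.mean()
    return float(np.mean(x[:len(x) - h] * x[h:]))

for h in range(4):
    print(h, round(theoretical_R(h), 3), round(sample_R(X, h), 3))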
Proof. Fix $h = 0, 1, \ldots$ and define $Y_t = X_{t+h}$, $t = 0, \pm 1, \pm 2, \ldots$. Then
\[
Y_t = \sum_{j=-\infty}^{\infty} \alpha_j \epsilon_{t+h-j} = \sum_{j=-\infty}^{\infty} \alpha_{j+h} \epsilon_{t-j}.
\]
Fix $t$ and define
\[
X_{nt} = \sum_{j=-n}^{n} \alpha_j \epsilon_{t-j}
\]
and
\[
Y_{nt} = \sum_{j=-n}^{n} \alpha_{j+h} \epsilon_{t-j}.
\]
Then
\[
X_{nt} - Y_{nt} = \sum_{j=-n}^{n} (\alpha_j - \alpha_{j+h}) \epsilon_{t-j};
\]