
The process is said to be second-order stationary, or covariance stationary, if µ_{t+h} and
K(s + h, t + h) do not depend on h. Hence, {X_t : t ∈ Z} is covariance stationary provided
that µ_t is constant, that is, does not depend on t, and K(s, t) depends on s, t only through
the difference |s − t|; in this case we write µ_t = µ and K(s, t) ≡ R(|s − t|) for some
function R on Z. The function R is known as the autocovariance function of the process;
the autocorrelation function of the process is also useful and is given by

    ρ(t) = R(t)/R(0),   t = 0, 1, ....
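
A minimal illustration (not from the text), assuming NumPy: sample analogues of the
autocovariance function R and the autocorrelation function ρ(t) = R(t)/R(0) computed
from one observed stretch x_0, ..., x_{n−1} of a covariance-stationary process; the
function name sample_autocovariance and the white-noise example are hypothetical choices.

    import numpy as np

    # Sample autocovariance R(0), ..., R(max_lag) from one observed stretch of a
    # covariance-stationary process; rho(t) = R(t)/R(0) is the sample autocorrelation.
    def sample_autocovariance(x, max_lag):
        x = np.asarray(x, dtype=float)
        n, xbar = len(x), np.mean(x)
        return np.array([np.sum((x[:n - h] - xbar) * (x[h:] - xbar)) / n
                         for h in range(max_lag + 1)])

    x = np.random.default_rng(3).standard_normal(1000)   # white noise: rho(t) ≈ 0 for t >= 1
    R = sample_autocovariance(x, max_lag=5)
    rho = R / R[0]                                        # rho(t) = R(t)/R(0)
    print(np.round(rho, 3))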

Example 6.4 (Moving differences). Let {Z_t : t ∈ Z} denote a covariance-stationary process
and define

    X_t = Z_{t+1} − Z_t,   t = 0, 1, ....

According to Example 6.3, {X_t : t ∈ Z} is stationary provided that {Z_t : t ∈ Z} is
stationary; here we assume only that {Z_t : t ∈ Z} is covariance stationary.
  Clearly, E(X_t) = 0 for all t. Let Cov(Z_t, Z_s) = R_Z(|t − s|); then, expanding
Cov(Z_{t+1} − Z_t, Z_{t+h+1} − Z_{t+h}) term by term,

    Cov(X_t, X_{t+h}) = 2R_Z(|h|) − R_Z(|h − 1|) − R_Z(|h + 1|).

Since |h − 1| = |−h + 1|, it follows that Cov(X_t, X_{t+h}) depends on h only through |h|,
so that {X_t : t ∈ Z} is covariance stationary.
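
A minimal Monte Carlo check of Example 6.4 (not from the text), assuming NumPy: the
choice of Z_t as a hypothetical MA(1) process, Z_t = e_t + θ e_{t−1}, is only for
illustration; it is covariance stationary with R_Z(0) = 1 + θ², R_Z(1) = θ and
R_Z(k) = 0 for k ≥ 2, so the displayed covariance formula can be compared with a
sample estimate.

    import numpy as np

    # Check Cov(X_t, X_{t+h}) = 2 R_Z(|h|) - R_Z(|h-1|) - R_Z(|h+1|) for the
    # differences X_t = Z_{t+1} - Z_t of an MA(1) process Z_t = e_t + theta*e_{t-1}.
    rng = np.random.default_rng(0)
    theta, n_rep, n_time = 0.6, 200_000, 12

    e = rng.standard_normal((n_rep, n_time + 1))
    Z = e[:, 1:] + theta * e[:, :-1]          # Z_t, t = 0, ..., n_time - 1
    X = Z[:, 1:] - Z[:, :-1]                  # X_t = Z_{t+1} - Z_t

    def R_Z(k):
        k = abs(k)
        return 1 + theta**2 if k == 0 else (theta if k == 1 else 0.0)

    t = 3
    for h in range(0, 4):
        sample = np.cov(X[:, t], X[:, t + h])[0, 1]
        theory = 2 * R_Z(h) - R_Z(h - 1) - R_Z(h + 1)
        print(f"h = {h}: sample {sample:+.3f}, theory {theory:+.3f}")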


Example 6.5 (Partial sums). Let Z_0, Z_1, Z_2, ... denote a sequence of independent,
identically distributed random variables, each with mean 0 and standard deviation σ.
Consider the process defined by

    X_t = Z_0 + ··· + Z_t,   t = 0, 1, ....

Then E(X_t) = 0 and Var(X_t) = (t + 1)σ²; hence, the process is not stationary.
  The variance of the process can be stabilized by considering

    Y_t = (Z_0 + ··· + Z_t)/√(t + 1),   t = 0, 1, ...,

which satisfies E(Y_t) = 0 and Var(Y_t) = σ² for all t = 0, 1, 2, .... However,

    Cov(Y_t, Y_s) = σ² min(s + 1, t + 1)/√[(s + 1)(t + 1)],

so that {Y_t : t ∈ Z} is not covariance stationary.
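
A minimal numerical check of Example 6.5 (not from the text), assuming NumPy: the
normalized partial sums Y_t have constant variance σ², yet Cov(Y_s, Y_t) differs for
pairs (s, t) with the same lag |s − t|, which is why the process fails to be covariance
stationary.

    import numpy as np

    # Normalized partial sums: Var(Y_t) = sigma^2 for every t, but
    # Cov(Y_s, Y_t) = sigma^2 * min(s+1, t+1)/sqrt((s+1)(t+1)) depends on more than |s - t|.
    rng = np.random.default_rng(1)
    sigma, n_rep, n_time = 2.0, 200_000, 9

    Z = sigma * rng.standard_normal((n_rep, n_time + 1))
    X = np.cumsum(Z, axis=1)                       # X_t = Z_0 + ... + Z_t
    Y = X / np.sqrt(np.arange(1, n_time + 2))      # Y_t = X_t / sqrt(t + 1)

    print(np.round(Y.var(axis=0), 3))              # roughly sigma^2 = 4 in every column

    for (s, t) in [(0, 2), (3, 5), (6, 8)]:        # three pairs with the same lag |s - t| = 2
        sample = np.cov(Y[:, s], Y[:, t])[0, 1]
        theory = sigma**2 * min(s + 1, t + 1) / np.sqrt((s + 1) * (t + 1))
        print(f"s = {s}, t = {t}: sample {sample:.3f}, theory {theory:.3f}")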




                                  6.3 Moving Average Processes

Let ..., ε_{−1}, ε_0, ε_1, ... denote a doubly infinite sequence of independent random
variables such that, for each j, E(ε_j) = 0 and Var(ε_j) = 1, and let ..., α_{−1}, α_0,
α_1, ... denote a doubly infinite sequence of constants. Consider the stochastic process
{X_t : t ∈ Z} defined by

    X_t = Σ_{j=−∞}^{∞} α_j ε_{t−j},   t = 0, 1, ....                              (6.1)
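
A minimal simulation sketch for (6.1), not from the text, assuming NumPy: taking only
finitely many nonzero coefficients (here α_0, α_1, α_2, a hypothetical choice) reduces
the doubly infinite sum to a finite-order moving average; for unit-variance ε_j a direct
calculation gives Cov(X_t, X_{t+h}) = Σ_j α_j α_{j+h}, which the sample covariance should
roughly reproduce.

    import numpy as np

    # Finite-order moving average: X_t = alpha_0*eps_t + alpha_1*eps_{t-1} + alpha_2*eps_{t-2},
    # all other alpha_j = 0.
    rng = np.random.default_rng(2)
    alpha = np.array([1.0, 0.5, 0.25])             # alpha_0, alpha_1, alpha_2
    q, n_time = len(alpha) - 1, 500

    eps = rng.standard_normal(n_time + q)          # eps_{-q}, ..., eps_{n_time - 1}
    X = np.array([alpha @ eps[t + q - np.arange(q + 1)] for t in range(n_time)])

    # Lag-1 covariance: sum_j alpha_j * alpha_{j+1} = 1.0*0.5 + 0.5*0.25 = 0.625.
    print(np.cov(X[:-1], X[1:])[0, 1])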