6.7 Exercises

6.12 Let {X_t : t ∈ Z} denote a finite moving average process of the form
         X_t = Σ_{j=0}^{q} α_j ε_{t−j}
     where ε_0, ε_1, ... are uncorrelated random variables, each with mean 0 and finite variance σ^2. Suppose that
         α_j = 1/(q + 1),  j = 0, 1, ..., q.
     Find the autocorrelation function of the process.
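     The following Python sketch is a numerical companion to Exercise 6.12, not part of the original text: it simulates the equal-weight moving average and reports sample autocorrelations, which can be compared with the analytic answer. The standard normal noise, the choice q = 4, and the sample size are arbitrary illustrative assumptions.

         # Simulation check for Exercise 6.12 (illustrative sketch; choices of q, noise
         # distribution, and sample size are arbitrary).
         import numpy as np

         rng = np.random.default_rng(0)
         q, n = 4, 200_000
         eps = rng.normal(size=n + q)                  # uncorrelated noise, mean 0, variance 1
         alpha = np.full(q + 1, 1.0 / (q + 1))         # alpha_j = 1/(q+1), j = 0, ..., q
         x = np.convolve(eps, alpha, mode="valid")     # X_t = sum_j alpha_j * eps_{t-j}

         def sample_autocorr(x, h):
             """Sample autocorrelation of x at lag h."""
             x = x - x.mean()
             return np.dot(x[h:], x[:len(x) - h]) / np.dot(x, x)

         for h in range(q + 2):
             print(h, round(sample_autocorr(x, h), 3)) # lags beyond q should be near 0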
6.13 Let ε_{−1}, ε_0, ε_1, ... denote independent random variables, each with mean 0 and standard deviation 1. Define
         X_t = α_0 ε_t + α_1 ε_{t−1},  t = 0, 1, ...
     where α_0 and α_1 are constants. Is the process {X_t : t ∈ Z} a Markov process?
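     For Exercise 6.13, here is a heuristic numerical diagnostic (an illustration only, assuming normally distributed ε_t, which is one special case): if {X_t} were Markov, then X_{t+1} and X_{t−1} would be conditionally independent given X_t, and for a Gaussian linear process the partial correlation of X_{t+1} and X_{t−1} given X_t would then be zero. The constants α_0 = 1, α_1 = 0.8 below are arbitrary.

         # Partial correlation of X_{t+1} and X_{t-1} given X_t (Gaussian-noise diagnostic).
         import numpy as np

         rng = np.random.default_rng(1)
         a0, a1, n = 1.0, 0.8, 200_000
         eps = rng.normal(size=n + 1)
         x = a0 * eps[1:] + a1 * eps[:-1]              # X_t = a0*eps_t + a1*eps_{t-1}
         past, present, future = x[:-2], x[1:-1], x[2:]

         def residual(y, z):
             """Residual from the least-squares regression of y on z (with intercept)."""
             beta = np.polyfit(z, y, 1)
             return y - np.polyval(beta, z)

         r = np.corrcoef(residual(future, present), residual(past, present))[0, 1]
         print("partial correlation given X_t:", round(r, 3))
         # A value clearly away from zero is evidence against the Markov property.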
6.14 Consider a Markov chain with state space {1, 2, ..., J}. A state i is said to communicate with a state j if
         P_{ij}^{(n)} > 0  for some n = 0, 1, ...
     and
         P_{ji}^{(n)} > 0  for some n = 0, 1, ....
     Show that communication is an equivalence relation on the state space. That is, show that a state i communicates with itself, that if i communicates with j then j communicates with i, and that if i communicates with j and j communicates with k, then i communicates with k.
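     The equivalence classes in Exercise 6.14 (the communicating classes) can be computed mechanically; the sketch below is an illustration, with an arbitrary example matrix and 0-based state labels rather than the {1, ..., J} of the exercise.

         # Communicating classes of a finite Markov chain from its transition matrix.
         import numpy as np

         def communicating_classes(P):
             """Partition the states into classes of mutually communicating states."""
             J = P.shape[0]
             reach = np.eye(J, dtype=bool) | (P > 0)   # n = 0 is allowed, so i reaches itself
             for _ in range(J):                        # transitive closure by repeated squaring
                 reach = reach | ((reach.astype(int) @ reach.astype(int)) > 0)
             communicate = reach & reach.T             # i <-> j iff each reaches the other
             classes, seen = [], set()
             for i in range(J):
                 if i not in seen:
                     cls = [j for j in range(J) if communicate[i, j]]
                     classes.append(cls)
                     seen.update(cls)
             return classes

         P = np.array([[0.5, 0.5, 0.0],
                       [0.3, 0.7, 0.0],
                       [0.2, 0.3, 0.5]])
         print(communicating_classes(P))               # [[0, 1], [2]] for this example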
6.15 Let {X_t : t ∈ T} denote a Markov chain with state space {1, 2, ..., J}. For each t = 0, 1, ..., let Y_t = (X_t, X_{t+1}) and consider the stochastic process {Y_t : t ∈ Z}, which has state space
         {1, ..., J} × {1, ..., J}.
     Is {Y_t : t ∈ T} a Markov chain?
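     As an aid to experimenting with Exercise 6.15 (an illustration, not the book's notation), the sketch below tabulates the one-step conditional probabilities Pr(Y_{t+1} = (j, k) | Y_t = (i, j)) of the pair process for a given transition matrix P of {X_t}; the example P is arbitrary and states are 0-based.

         # One-step conditional probabilities of the pair process Y_t = (X_t, X_{t+1}).
         import numpy as np

         def pair_chain_matrix(P):
             """Matrix of Pr(Y_{t+1} = (j, k) | Y_t = (i, j)), pairs listed in row-major order."""
             J = P.shape[0]
             Q = np.zeros((J * J, J * J))
             for i in range(J):
                 for j in range(J):
                     for k in range(J):
                         # from (i, j) the pair can only move to (j, k), with probability P[j, k]
                         Q[i * J + j, j * J + k] = P[j, k]
             return Q

         P = np.array([[0.9, 0.1],
                       [0.4, 0.6]])
         Q = pair_chain_matrix(P)
         print(Q.round(2))
         print(Q.sum(axis=1))                          # each row sums to 1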
6.16 Let Y_1, Y_2, ... denote independent, identically distributed random variables such that
         Pr(Y_1 = j) = p_j,  j = 1, ..., J,
     where p_1 + ··· + p_J = 1. For each t = 1, 2, ..., let
         X_t = max{Y_1, ..., Y_t}.
     Is {X_t : t ∈ Z} a Markov chain? If so, find the transition probability matrix.
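     A simulation sketch for Exercise 6.16 (illustrative only, not the solution): estimate the empirical one-step transition frequencies of the running maximum, which can be compared with a candidate transition probability matrix. The distribution p, the number of paths, and the path length are arbitrary choices.

         # Empirical one-step transition frequencies of X_t = max{Y_1, ..., Y_t}.
         import numpy as np

         rng = np.random.default_rng(2)
         p = np.array([0.2, 0.5, 0.3])                 # Pr(Y = 1), Pr(Y = 2), Pr(Y = 3)
         J, n_paths, T = len(p), 50_000, 6

         Y = rng.choice(np.arange(1, J + 1), size=(n_paths, T), p=p)
         X = np.maximum.accumulate(Y, axis=1)          # running maximum along each path

         counts = np.zeros((J, J))
         for t in range(T - 1):
             np.add.at(counts, (X[:, t] - 1, X[:, t + 1] - 1), 1)
         freq = counts / counts.sum(axis=1, keepdims=True)
         print(freq.round(3))                          # estimate of Pr(X_{t+1} = j | X_t = i)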
6.17 Let P denote the transition probability matrix of a Markov chain and suppose that P is doubly stochastic; that is, suppose that the rows and columns of P both sum to 1. Find the stationary distribution of the Markov chain.
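     A quick numerical experiment related to Exercise 6.17 (illustrative; the construction of P below is just one arbitrary way to obtain a strictly positive doubly stochastic matrix): iterate an arbitrary initial distribution under P, observe where it settles, and compare with the answer to the exercise.

         # Iterating a distribution under a doubly stochastic transition matrix.
         import numpy as np

         rng = np.random.default_rng(3)
         J = 4
         perms = [np.eye(J)[rng.permutation(J)] for _ in range(3)]       # permutation matrices
         P = 0.9 * sum(perms) / 3 + 0.1 * np.full((J, J), 1.0 / J)       # doubly stochastic, all entries > 0
         assert np.allclose(P.sum(axis=0), 1) and np.allclose(P.sum(axis=1), 1)

         pi = rng.dirichlet(np.ones(J))                # arbitrary starting distribution
         for _ in range(200):
             pi = pi @ P                               # one step: pi_{t+1} = pi_t P
         print(pi.round(4))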
6.18 A counting process {N(t) : t ≥ 0} is said to be a nonhomogeneous Poisson process with intensity function λ(·) if the process satisfies (PP1) and (PP2) and, instead of (PP3), for any nonnegative s, t, N(t + s) − N(s) has a Poisson distribution with mean
         ∫_s^{t+s} λ(u) du.
     Assume that λ(·) is a positive, continuous function defined on [0, ∞). Find an increasing, one-to-one function h : [0, ∞) → [0, ∞) such that {Ñ(t) : t ≥ 0} is a Poisson process, where Ñ(t) = N(h(t)).
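     Exercise 6.18 is closely related to the standard time-change construction for nonhomogeneous Poisson processes. The sketch below (an illustration, not the solution) simulates such a process by mapping the arrival times of a unit-rate Poisson process through the inverse of the cumulative intensity Λ(t) = ∫_0^t λ(u) du; the intensity function and the time horizon are arbitrary examples.

         # Simulating a nonhomogeneous Poisson process by a time change of a unit-rate process.
         import numpy as np
         from scipy.integrate import quad
         from scipy.optimize import brentq

         def lam(u):
             return 1.0 + 0.5 * np.sin(u)              # example positive, continuous intensity

         def Lambda(t):
             return quad(lam, 0.0, t)[0]               # cumulative intensity on [0, t]

         def simulate(T_end, rng):
             """Event times on [0, T_end] of a nonhomogeneous Poisson process with intensity lam."""
             total = Lambda(T_end)
             times, s = [], 0.0
             while True:
                 s += rng.exponential(1.0)             # next arrival of a unit-rate Poisson process
                 if s > total:
                     return np.array(times)
                 # map back through the inverse cumulative intensity: solve Lambda(t) = s
                 times.append(brentq(lambda t: Lambda(t) - s, 0.0, T_end))

         rng = np.random.default_rng(4)
         events = simulate(10.0, rng)
         print(len(events), "events; expected count about", round(Lambda(10.0), 2))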