By definition (2.28),
$$\operatorname{Var}[X(t) - X(s)] = \operatorname{Var}[X(t)] + \operatorname{Var}[X(s)] - 2\operatorname{Cov}[X(t), X(s)]$$
Thus,
$$\operatorname{Cov}[X(t), X(s)] = \tfrac{1}{2}\bigl\{\operatorname{Var}[X(t)] + \operatorname{Var}[X(s)] - \operatorname{Var}[X(t) - X(s)]\bigr\}$$
Using Eqs. (5.101) and (5.102), we obtain
$$K_X(t, s) = \begin{cases} \tfrac{1}{2}\sigma_1^2[t + s - (t - s)] = \sigma_1^2 s & t > s \\[4pt] \tfrac{1}{2}\sigma_1^2[t + s - (s - t)] = \sigma_1^2 t & s > t \end{cases}$$
or
$$K_X(t, s) = \sigma_1^2 \min(t, s)$$
where $\sigma_1^2 = \operatorname{Var}[X(1)]$.
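The result can also be checked numerically. The Python sketch below is an illustration, not part of the text: it approximates Wiener sample paths by cumulative sums of independent Gaussian increments, with the step size, number of paths, the time points $t$ and $s$, and the choice $\sigma_1^2 = 1$ all being arbitrary assumptions, and compares the sample value of $\operatorname{Cov}[X(t), X(s)]$ with $\sigma_1^2 \min(t, s)$.

```python
import numpy as np

# Illustrative parameters (assumptions, not taken from the text)
sigma1_sq = 1.0        # sigma_1^2 = Var[X(1)]
dt = 0.01              # time step of the discrete approximation
T = 2.0                # simulate X(t) on [0, T]
n_paths = 20_000       # number of independent sample paths
rng = np.random.default_rng(0)

n_steps = int(T / dt)
# Independent Gaussian increments with variance sigma_1^2 * dt; their
# cumulative sums approximate Wiener sample paths X(dt), X(2*dt), ...
increments = rng.normal(0.0, np.sqrt(sigma1_sq * dt), size=(n_paths, n_steps))
paths = np.cumsum(increments, axis=1)

t, s = 1.5, 0.6                          # two fixed time points with t > s
X_t = paths[:, int(round(t / dt)) - 1]
X_s = paths[:, int(round(s / dt)) - 1]

# The Wiener process has zero mean, so Cov[X(t), X(s)] = E[X(t)X(s)]
sample_cov = np.mean(X_t * X_s)
print("sample  Cov[X(t), X(s)]       ≈", sample_cov)
print("theory  sigma_1^2 * min(t, s) =", sigma1_sq * min(t, s))
```

For a large number of paths the two printed values should agree to within Monte Carlo error.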
5.24. (a) Show that a simple random walk X(n) of Prob. 5.2 is a Markov chain.
(b) Find its one-step transition probabilities.
(a) From Eq. (5.73) (Prob. 5.10), X(n) = {X_n, n ≥ 0} can be expressed as
$$X_0 = 0, \qquad X_n = X_{n-1} + Z_n \quad (n \ge 1)$$
where $Z_n$ ($n = 1, 2, \ldots$) are i.i.d. r.v.'s with
$$P(Z_n = k) = a_k \quad (k = 1, -1), \qquad a_1 = p, \quad a_{-1} = q = 1 - p$$
Then X(n) = {X_n, n ≥ 0} is a Markov chain, since
$$\begin{aligned}
P(X_{n+1} = i_{n+1} \mid X_0 = 0, X_1 = i_1, \ldots, X_n = i_n)
&= P(Z_{n+1} + i_n = i_{n+1} \mid X_0 = 0, X_1 = i_1, \ldots, X_n = i_n) \\
&= P(Z_{n+1} = i_{n+1} - i_n) = a_{i_{n+1} - i_n} = P(X_{n+1} = i_{n+1} \mid X_n = i_n)
\end{aligned}$$
since $Z_{n+1}$ is independent of $X_0, X_1, \ldots, X_n$.
(b) The one-step transition probabilities are given by
k=j+l
pjk=P(X,=klX,-I =j)= 1 -p k=j-1
otherwise
which do not depend on n. Thus, a simple random walk X(n) is a homogeneous Markov chain.
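Because the transition probabilities do not depend on $n$, they can be checked empirically at different times. The Python sketch below is an illustration, not part of the text: the value $p = 0.6$, the number and length of the simulated walks, and the state $j = 1$ are arbitrary assumptions. It simulates many independent realizations of the walk and estimates $P(X_n = k \mid X_{n-1} = j)$ at two different values of $n$; both estimates should be close to $p$ for $k = j + 1$ and to $1 - p$ for $k = j - 1$, with no other transitions observed.

```python
import numpy as np

# Illustrative parameters (assumptions, not taken from the text)
p = 0.6                 # P(Z_n = +1); q = 1 - p = P(Z_n = -1)
n_walks = 200_000       # number of independent realizations of the walk
length = 12             # number of steps per realization
rng = np.random.default_rng(1)

# Each row is one realization: X_0 = 0, X_n = X_{n-1} + Z_n
steps = rng.choice([1, -1], size=(n_walks, length), p=[p, 1 - p])
X = np.concatenate([np.zeros((n_walks, 1), dtype=int),
                    np.cumsum(steps, axis=1)], axis=1)

def estimate(n, j, k):
    """Empirical P(X_n = k | X_{n-1} = j) across the realizations."""
    cond = X[:, n - 1] == j
    return np.mean(X[cond, n] == k)

j = 1
for n in (6, 12):       # two different times: the estimates should not depend on n
    print(f"n = {n:2d}:  p_(j, j+1) ≈ {estimate(n, j, j + 1):.3f}   "
          f"p_(j, j-1) ≈ {estimate(n, j, j - 1):.3f}")
print(f"theory:   p = {p},  1 - p = {1 - p},  0 otherwise")
```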
5.25. Show that for a Markov process X(t), the second-order distribution is sufficient to characterize X(t).
Let X(t) be a Markov process with the nth-order distribution
$$F_X(x_1, \ldots, x_n; t_1, \ldots, t_n) = P[X(t_1) \le x_1, \ldots, X(t_n) \le x_n]$$
Then, using the Markov property (5.26), we have
$$F_X(x_1, \ldots, x_n; t_1, \ldots, t_n) = F_X(x_n; t_n \mid x_{n-1}; t_{n-1})\, F_X(x_1, \ldots, x_{n-1}; t_1, \ldots, t_{n-1})$$
Applying the above relation repeatedly for lower-order distributions, we can write
$$F_X(x_1, \ldots, x_n; t_1, \ldots, t_n) = F_X(x_n; t_n \mid x_{n-1}; t_{n-1}) \cdots F_X(x_2; t_2 \mid x_1; t_1)\, F_X(x_1; t_1)$$
Since each conditional distribution $F_X(x_k; t_k \mid x_{k-1}; t_{k-1})$ is determined by the second-order distribution $F_X(x_{k-1}, x_k; t_{k-1}, t_k)$, the second-order distribution is sufficient to characterize X(t).
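For a discrete-state, discrete-time Markov chain the same argument gives, in terms of probability mass functions, $P(X_1 = i_1, \ldots, X_n = i_n) = P(X_1 = i_1) \prod_{k=2}^{n} P(X_k = i_k \mid X_{k-1} = i_{k-1})$. The Python sketch below is an illustration of this discrete analogue, not part of the text; the two-state transition matrix, initial distribution, and chosen path are arbitrary assumptions. It compares the factorized probability of one particular path with a Monte Carlo estimate obtained by simulating the chain.

```python
import numpy as np

# Illustrative two-state Markov chain (all values are assumptions, not from the text)
P = np.array([[0.7, 0.3],      # P[j, k] = P(X_n = k | X_{n-1} = j)
              [0.4, 0.6]])
pi1 = np.array([0.5, 0.5])     # first-order distribution of X_1

path = [0, 1, 1, 0]            # one particular sequence (i_1, i_2, i_3, i_4)

# Factorized probability: P(X_1 = i_1) * prod_k P(X_k = i_k | X_{k-1} = i_{k-1})
prob = pi1[path[0]]
for prev, cur in zip(path[:-1], path[1:]):
    prob *= P[prev, cur]
print("factorized P(path) =", prob)

# Monte Carlo estimate of the same joint probability
rng = np.random.default_rng(2)
n_runs = 100_000
hits = 0
for _ in range(n_runs):
    x = int(rng.choice(2, p=pi1))          # draw X_1
    seq = [x]
    for _ in range(len(path) - 1):
        x = int(rng.choice(2, p=P[x]))     # draw X_k given X_{k-1}
        seq.append(x)
    if seq == path:
        hits += 1
print("simulated  P(path) ≈", hits / n_runs)
```

Both printed values should agree to within sampling error, illustrating that the first-order distribution together with the one-step (second-order) conditional distributions determines every higher-order distribution.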