Page 196 - Probability, Random Variables and Random Processes
that is,
$$P(X_n = 0) = \frac{2}{3} - \frac{(0.7)^n}{6} \qquad\text{and}\qquad P(X_n = 1) = \frac{1}{3} + \frac{(0.7)^n}{6}$$
(b) Since $\lim_{n\to\infty}(0.7)^n = 0$, the distribution of $X_n$ as $n \to \infty$ is
$$P(X_n = 0) = \frac{2}{3} \qquad\text{and}\qquad P(X_n = 1) = \frac{1}{3}$$
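These formulas can be checked numerically. The transition matrix appears earlier in the problem; one two-state chain consistent with the expressions above (an assumption for this sketch) has $P = \begin{bmatrix} 0.9 & 0.1 \\ 0.2 & 0.8 \end{bmatrix}$ with initial distribution $(1/2, 1/2)$, which gives stationary distribution $(2/3, 1/3)$ and second eigenvalue $0.7$:

```python
import numpy as np

# Assumed two-state chain consistent with the closed-form expressions:
# stationary distribution (2/3, 1/3), second eigenvalue 0.7,
# initial distribution (1/2, 1/2).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
p0 = np.array([0.5, 0.5])

for n in range(1, 6):
    # Exact n-step distribution: p0 @ P^n
    pn = p0 @ np.linalg.matrix_power(P, n)
    # Closed-form expressions from the text
    closed = np.array([2/3 - (0.7**n)/6, 1/3 + (0.7**n)/6])
    assert np.allclose(pn, closed)

# As n -> infinity, (0.7)^n -> 0, so the distribution tends to (2/3, 1/3)
print(p0 @ np.linalg.matrix_power(P, 50))  # approximately [0.6667, 0.3333]
```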
Verify the transitivity property of the Markov chain; that is, if $i \to j$ and $j \to k$, then $i \to k$.
By definition, the relations $i \to j$ and $j \to k$ imply that there exist integers $n$ and $m$ such that $p_{ij}^{(n)} > 0$ and $p_{jk}^{(m)} > 0$. Then, by the Chapman-Kolmogorov equation (5.38), we have
$$p_{ik}^{(n+m)} = \sum_r p_{ir}^{(n)} p_{rk}^{(m)} \ge p_{ij}^{(n)} p_{jk}^{(m)} > 0 \qquad (5.122)$$
Therefore $i \to k$.
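The inequality is easy to see with matrix powers, since the $(i, k)$ entry of $P^{n+m}$ is exactly the Chapman-Kolmogorov sum and therefore dominates the single term $p_{ij}^{(n)} p_{jk}^{(m)}$. A minimal sketch (the 3-state cyclic chain here is an arbitrary illustration, not from the text):

```python
import numpy as np

# Illustrative 3-state chain: 0 -> 1 -> 2 -> 0 deterministically
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

i, j, k, n, m = 0, 1, 2, 1, 1
Pn = np.linalg.matrix_power(P, n)
Pm = np.linalg.matrix_power(P, m)
Pnm = np.linalg.matrix_power(P, n + m)

assert Pn[i, j] > 0 and Pm[j, k] > 0          # i -> j and j -> k
# Chapman-Kolmogorov: P^(n+m) = P^n P^m, so the (i, k) entry of P^(n+m)
# is a sum of nonnegative terms that includes Pn[i, j] * Pm[j, k]
assert Pnm[i, k] >= Pn[i, j] * Pm[j, k] > 0   # hence i -> k
```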
Verify Eq. (5.42).
If the Markov chain $\{X_n\}$ goes from state $i$ to state $j$ for the first time in $m$ steps, the first step must take the chain from $i$ to some state $k$, where $k \ne j$. After that first step to $k$, $m - 1$ steps remain, and the chain must reach state $j$ from state $k$ on the last of them; that is, starting from state $k$, the first visit to state $j$ must occur at the $(m-1)$st step. Thus we must have
$$f_{ij}(m) = \sum_{k \ne j} p_{ik} f_{kj}(m-1)$$
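The resulting first-passage recursion, $f_{ij}(m) = \sum_{k \ne j} p_{ik} f_{kj}(m-1)$ with $f_{ij}(1) = p_{ij}$, can be checked numerically; the $2 \times 2$ chain below is an illustrative assumption chosen because its first-passage probabilities have a simple closed form:

```python
import numpy as np

# Illustrative chain (assumed for this sketch)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

def first_passage(P, i, j, m):
    """f_ij(m): probability that the chain started at i first
    visits j at step m, computed via the recursion
    f_ij(m) = sum over k != j of p_ik * f_kj(m-1), f_ij(1) = p_ij."""
    if m == 1:
        return P[i, j]
    return sum(P[i, k] * first_passage(P, k, j, m - 1)
               for k in range(P.shape[0]) if k != j)

# For this chain, avoiding state 1 means staying in state 0,
# so f_01(m) = 0.9**(m-1) * 0.1 in closed form.
for m in range(1, 6):
    assert abs(first_passage(P, 0, 1, m) - 0.9**(m-1) * 0.1) < 1e-12
```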
Show that in a finite-state Markov chain, not all states can be transient.
Suppose that the states are $0, 1, \ldots, m$, and suppose that they are all transient. Then by definition, after a finite amount of time (say $T_0$), state 0 will never be visited; after a finite amount of time (say $T_1$), state 1 will never be visited; and so on. Thus, after the finite time $T = \max\{T_0, T_1, \ldots, T_m\}$, no state will be visited. But since the process must be in some state after time $T$, we have a contradiction. We conclude that not all states can be transient: at least one of the states must be recurrent.
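A related computational fact points the same way: every finite stochastic matrix has eigenvalue 1 (its rows sum to 1), so a stationary distribution $\pi$ with $\pi P = \pi$ always exists, and any state carrying positive stationary probability is recurrent. A sketch with an assumed 3-state chain:

```python
import numpy as np

# Assumed 3-state chain for illustration
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

# A left eigenvector of P for eigenvalue 1 is a right eigenvector of P^T.
vals, vecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(vals - 1.0))      # locate the eigenvalue 1
pi = np.real(vecs[:, idx])
pi = pi / pi.sum()                        # normalize to a distribution

assert np.allclose(pi @ P, pi)            # pi is stationary
assert (pi > 0).any()                     # some state is recurrent
```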
A state transition diagram of a finite-state Markov chain is a line diagram with a vertex corresponding to each state and a directed line from vertex $i$ to vertex $j$ if $p_{ij} > 0$. In such a diagram, if one can move from $i$ to $j$ by a path following the arrows, then $i \to j$. The diagram is useful for determining whether a finite-state Markov chain is irreducible or not, and for checking periodicities. Draw the state transition diagrams and classify the states of the Markov chains with the following transition probability matrices:
(a) $P$ = [transition matrix illegible in source]