Lemma 3.2.3 Suppose that state j is transient. Then, for any state i ∈ I,

$$\lim_{n\to\infty} p_{ij}^{(n)} = 0.$$
Proof By (3.2.10), $\sum_{n=1}^{\infty} p_{jj}^{(n)} < \infty$ and thus $\lim_{n\to\infty} p_{jj}^{(n)} = 0$. Take now a starting state i with i ≠ j. By conditioning on the first epoch at which the process makes a transition into state j, we obtain the useful relation

$$p_{ij}^{(n)} = \sum_{k=1}^{n} p_{jj}^{(n-k)} f_{ij}^{(k)}, \qquad n = 1, 2, \ldots. \tag{3.2.12}$$
Since $\lim_{n\to\infty} p_{jj}^{(n)}$ exists and $\sum_{k=1}^{\infty} f_{ij}^{(k)} = f_{ij} < \infty$, it follows from the bounded convergence theorem in Appendix A that

$$\lim_{n\to\infty} p_{ij}^{(n)} = f_{ij} \lim_{n\to\infty} p_{jj}^{(n)}. \tag{3.2.13}$$

Since $\lim_{n\to\infty} p_{jj}^{(n)} = 0$, the lemma now follows.
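As a concrete check of the first-passage decomposition, the following Python sketch computes the n-step probabilities $p_{ij}^{(n)}$ and the first-passage probabilities $f_{ij}^{(k)}$ for a small chain with a transient target state and verifies relation (3.2.12) numerically. The three-state chain and its transition probabilities are invented for illustration and are not taken from the text.

```python
import numpy as np

# Hypothetical three-state chain: state 2 is absorbing, so states 0 and 1
# are transient (the chain eventually leaves them for good).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.4, 0.4],
              [0.0, 0.0, 1.0]])
i, j, N = 0, 1, 30

# n-step transition probabilities: p_ij^(n) = (P^n)[i, j], with P^0 = I.
Pn = [np.linalg.matrix_power(P, n) for n in range(N + 1)]

# First-passage probabilities f_ij^(k) (first visit to j at epoch k), via the
# recursion f_ij^(1) = p_ij,  f_ij^(k) = sum_{l != j} p_il * f_lj^(k-1).
f = np.zeros((N + 1, 3))
f[1, :] = P[:, j]
for k in range(2, N + 1):
    for s in range(3):
        f[k, s] = sum(P[s, l] * f[k - 1, l] for l in range(3) if l != j)

# Relation (3.2.12): p_ij^(n) = sum_{k=1}^{n} p_jj^(n-k) * f_ij^(k).
for n in range(1, N + 1):
    lhs = Pn[n][i, j]
    rhs = sum(Pn[n - k][j, j] * f[k, i] for k in range(1, n + 1))
    assert abs(lhs - rhs) < 1e-12

# Both sides tend to 0 as n grows, as Lemma 3.2.3 asserts for a transient j.
print("p_ij^(30) =", Pn[N][i, j])
```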
The limiting behaviour of $p_{ij}^{(n)}$ as n → ∞ for a recurrent state j will be discussed in Section 3.3. It will be seen that this limit does not always exist. For a recurrent state j an important concept is the mean recurrence time $\mu_{jj}$, which is defined by

$$\mu_{jj} = \sum_{n=1}^{\infty} n f_{jj}^{(n)}. \tag{3.2.14}$$
In other words, $\mu_{jj}$ is the expected number of transitions needed to return from state j to itself. A recurrent state j is said to be positive recurrent if $\mu_{jj} < \infty$ and is said to be null-recurrent if $\mu_{jj} = \infty$. In Section 3.5 it will be seen that null recurrence can only occur in Markov chains with an infinite state space.
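For a finite chain, (3.2.14) can be evaluated directly. The sketch below uses an invented irreducible three-state chain: it computes the first-return probabilities $f_{jj}^{(n)}$ by the same first-passage recursion as above and accumulates $\sum_n n f_{jj}^{(n)}$, which settles quickly at a finite value, consistent with every state of a finite irreducible chain being positive recurrent. The infinite-state examples discussed next behave differently.

```python
import numpy as np

# Hypothetical irreducible three-state chain; all states communicate.
P = np.array([[0.1, 0.6, 0.3],
              [0.5, 0.0, 0.5],
              [0.4, 0.4, 0.2]])
j, N = 0, 200   # target state and truncation point of the series (3.2.14)

# First-return probabilities via
# f_ij^(1) = p_ij,  f_ij^(n) = sum_{l != j} p_il * f_lj^(n-1).
f = np.zeros((N + 1, 3))
f[1, :] = P[:, j]
for n in range(2, N + 1):
    for s in range(3):
        f[n, s] = sum(P[s, l] * f[n - 1, l] for l in range(3) if l != j)

f_jj = f[:, j]
print("f_jj  =", f_jj.sum())                                 # close to 1: state j is recurrent
print("mu_jj =", sum(n * f_jj[n] for n in range(1, N + 1)))  # finite: positive recurrent
```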
To illustrate that null recurrence requires an infinite state space, consider the Markov chain $\{X_n\}$ describing the drunkard's walk on an infinite square in Example 3.1.1 (N = ∞). It can be shown for this infinite-state random walk that each state (x, y) is recurrent, but the mean recurrence time of each state is ∞ so that all states are null-recurrent. The same holds for the infinite-state Markov chain describing the symmetric random walk on the integers
($p_{i,i+1} = p_{i,i-1} = \frac{1}{2}$ for any integer i). However, for the symmetric random
walk on an infinite lattice in three or more dimensions, the corresponding Markov
chain has the property that all states are transient (in three dimensions, the prob-
ability of ever returning to the origin when starting there equals 0.3405). These
remarkable results will not be proved here, but are mentioned to show that Markov
chains with an infinite state space are intrinsically more complex than finite-state
Markov chains.
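These remarks can at least be made plausible by simulation. The Monte Carlo sketch below (the horizon and sample sizes are arbitrary choices, and the finite horizon biases both estimates slightly downward) estimates the probability that the symmetric random walk started at the origin ever returns to it, in one and in three dimensions; the one-dimensional estimate is close to 1, while the three-dimensional estimate stays near the quoted value 0.3405.

```python
import random

def returns_to_origin(dim, max_steps):
    """Simulate one path of the symmetric random walk on the dim-dimensional
    integer lattice; report whether it revisits the origin within max_steps."""
    pos = [0] * dim
    for _ in range(max_steps):
        axis = random.randrange(dim)          # pick a coordinate direction
        pos[axis] += random.choice((-1, 1))   # step +1 or -1 along it
        if all(c == 0 for c in pos):
            return True
    return False

def estimate_return_probability(dim, max_steps=5_000, paths=1_000):
    hits = sum(returns_to_origin(dim, max_steps) for _ in range(paths))
    return hits / paths

random.seed(2024)
for d in (1, 3):
    print(f"d = {d}: estimated return probability ~ {estimate_return_probability(d):.3f}")
```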