Definition 4 Closed Communicating Class (i.e. Absorbing Class); Absorbing
State A communicating class C is said to be closed if no state within C leads to
any state outside C. Formally, a communicating class C is said to be closed if i ∈ C
and j ∉ C imply that j is not accessible from i.
Note that once a Markov chain visits a closed communicating class, it cannot
leave it. Hence we will sometimes refer to closed communicating classes as
‘absorbing classes’. This latter term is not standard in the literature, but we find
it useful here for explanatory purposes. Note that if a Markov chain has one single
communicating class, it must be closed.
As an example, note that the communicating classes {s_10} and {s_1, s_3, s_4, s_6,
s_7, s_8, s_9, s_11, s_12} in the THMC represented in Fig. 13.10 are not closed, as they
can be abandoned. On the other hand, the communicating classes {s_2} and {s_5}
are indeed closed, since they cannot be abandoned. When a closed communicating
class consists of one single state, this state is called absorbing. Thus, s_2 and s_5 are
absorbing states. Formally, state i is absorbing if and only if p_i,i = 1 and p_i,j = 0 for
i ≠ j.
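As a concrete illustration, the short sketch below (using a hypothetical 3-state transition matrix, not the THMC of Fig. 13.10) flags absorbing states simply by checking the condition p_i,i = 1 on the diagonal of the transition matrix.

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1); state 1 is absorbing.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 1.0, 0.0],
              [0.2, 0.3, 0.5]])

def absorbing_states(P):
    """Return the indices i with p_i,i = 1 (hence p_i,j = 0 for all j != i)."""
    return [i for i in range(len(P)) if np.isclose(P[i, i], 1.0)]

print(absorbing_states(P))  # -> [1]
```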
Proposition 2 Decomposition Theorem (Chung 1960) The state space S of any
Markov chain can be uniquely partitioned as follows:
S = C_1 ∪ C_2 ∪ ... ∪ C_k ∪ T
where C_1, C_2, ..., C_k are closed communicating classes, and T is the union of
all other communicating classes.
Note that we do not distinguish between non-closed communicating classes: we
lump them all together into T. Thus, the unique partition of the THMC represented
in Fig. 13.10 is S = {s_2} ∪ {s_5} ∪ {s_1, s_3, s_4, s_6, s_7, s_8, s_9, s_10, s_11, s_12}. The simple
random walk model presented in Sect. 13.7.1 has one single (closed) communicating
class C_1 containing all the possible states, i.e. S = C_1.
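One way to compute this partition in practice, sketched below under the assumption that the chain is given as a transition matrix (here the same hypothetical 3-state matrix as above), is to take the communicating classes as the strongly connected components of the directed graph with an edge i → j whenever p_i,j > 0, and to mark a class as closed when no edge leaves it.

```python
import numpy as np
import networkx as nx

# Hypothetical 3-state transition matrix; not the THMC of Fig. 13.10.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 1.0, 0.0],
              [0.2, 0.3, 0.5]])
n = len(P)

# Directed graph with an edge i -> j whenever p_i,j > 0.
G = nx.DiGraph()
G.add_nodes_from(range(n))
G.add_edges_from((i, j) for i in range(n) for j in range(n) if P[i, j] > 0)

# Communicating classes = strongly connected components of G.
classes = [frozenset(c) for c in nx.strongly_connected_components(G)]

# A class C is closed if no edge leaves C.
def is_closed(C):
    return all(j in C for i in C for j in G.successors(i))

closed_classes = [C for C in classes if is_closed(C)]
not_closed = [C for C in classes if not is_closed(C)]
T = set().union(*not_closed) if not_closed else set()

print("closed classes:", closed_classes)  # -> [frozenset({1})]
print("T:", T)                            # -> {0, 2}
```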
Definition 5 Irreducibility A Markov chain is said to be irreducible if all its states
belong to a single closed communicating class; otherwise it is called reducible.
Thus, the simple random walk presented in Sect. 13.7.1 is irreducible, but the
THMC represented in Fig. 13.10 is reducible.
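Equivalently, a finite Markov chain is irreducible exactly when the directed graph of its positive transition probabilities is strongly connected, so irreducibility can be tested as in the following minimal sketch (again on a hypothetical matrix).

```python
import numpy as np
import networkx as nx

def is_irreducible(P):
    """A finite THMC is irreducible iff its transition digraph is strongly connected."""
    n = len(P)
    G = nx.DiGraph()
    G.add_nodes_from(range(n))
    G.add_edges_from((i, j) for i in range(n) for j in range(n) if P[i, j] > 0)
    return nx.is_strongly_connected(G)

# Reducible example: state 1 is absorbing, so not every state is reachable from it.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 1.0, 0.0],
              [0.2, 0.3, 0.5]])
print(is_irreducible(P))  # -> False
```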
Definition 6 Transient and Recurrent States A state i is said to be transient if,
given that we start in state i, there is a non-zero probability that we will never return
to i. Otherwise, the state is called recurrent. A Markov chain starting from a
recurrent state will revisit it with probability 1 and hence revisit it infinitely often.
On the other hand, a Markov chain starting from a transient state has a strictly
positive probability of never coming back to it. Thus, a Markov chain will visit
any transient state only finitely many times; eventually, transient states are no longer
revisited.
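The difference between transient and recurrent states can also be seen by simulation. The rough Monte Carlo sketch below (again on a hypothetical 3-state matrix, and using a finite horizon as an approximation of "ever returning") estimates the return probability for a given start state; it stays well below 1 for a transient state and equals 1 for the absorbing (hence recurrent) state.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state matrix: state 1 is absorbing (recurrent); states 0 and 2 are transient.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 1.0, 0.0],
              [0.2, 0.3, 0.5]])

def estimated_return_prob(P, start, runs=5_000, horizon=100):
    """Fraction of runs that return to `start` within `horizon` steps (finite-horizon proxy)."""
    returns = 0
    for _ in range(runs):
        state = start
        for _ in range(horizon):
            state = rng.choice(len(P), p=P[state])
            if state == start:
                returns += 1
                break
    return returns / runs

print(estimated_return_prob(P, 0))  # transient: well below 1 (about 0.5 here)
print(estimated_return_prob(P, 1))  # recurrent (absorbing): 1.0
```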
Definition 7 Periodic and Aperiodic States: Periodic and Aperiodic Communicating
Classes A state i has period d if any return to state i must occur in multiples
of d time-steps. If d = 1, then the state is said to be aperiodic; otherwise (d > 1), the