Page 91 - A First Course In Stochastic Models

THE MODEL

                 Figure 3.1.1 The drunkard’s random walk
                 (a square grid with corners (−N, −N), (−N, N), (N, −N) and (N, N), centred at the origin (0, 0))
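The walk in Figure 3.1.1 can be reproduced in simulation. A minimal sketch, under the assumption that the drunkard moves each unit of time to one of the four neighbouring grid points with equal probability and that moves leaving the square {−N, . . . , N}² are resampled; this move set and boundary rule are illustrative assumptions, not taken from the text:

```python
import random

def drunkard_walk(N=5, steps=20, seed=42):
    """Simulate a random walk on the square grid {-N,...,N}^2.

    Assumption: at each step the drunkard moves to one of the four
    neighbouring grid points with equal probability; a move that would
    leave the square is resampled (an illustrative boundary rule).
    """
    random.seed(seed)
    x, y = 0, 0            # start at the origin (0, 0)
    path = [(x, y)]
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(steps):
        while True:
            dx, dy = random.choice(moves)
            if abs(x + dx) <= N and abs(y + dy) <= N:
                break
        x, y = x + dx, y + dy
        path.append((x, y))
    return path

path = drunkard_walk()
print(path[:5])
```

Because the next position depends on the current position only, the simulated process has the Markovian property discussed below.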


                process {X_n} has the Markovian property. We are now ready to give the general
                definition of a Markov chain.
                  Let {X_n, n = 0, 1, . . . } be a sequence of random variables with state space I. We
                interpret the random variable X_n as the state of some dynamic system at time n.
                The set of possible values of the process is denoted by I and is assumed to be
                finite or countably infinite.


                Definition 3.1.1 The stochastic process {X_n, n = 0, 1, . . . } with state space I is
                called a discrete-time Markov chain if, for each n = 0, 1, . . . ,

                  P{X_{n+1} = i_{n+1} | X_0 = i_0, . . . , X_n = i_n} = P{X_{n+1} = i_{n+1} | X_n = i_n}   (3.1.1)

                for all possible values of i_0, . . . , i_{n+1} ∈ I.
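Condition (3.1.1) can be checked empirically for a simple chain by simulation: estimate both conditional probabilities from many sampled paths and compare them. A minimal sketch with a hypothetical two-state chain on I = {0, 1}; the transition probabilities are chosen purely for illustration:

```python
import random

random.seed(1)

# Hypothetical two-state chain on I = {0, 1} with illustrative
# one-step transition probabilities p_01 = 0.3 and p_10 = 0.6.
P = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.6, 1: 0.4}}

def step(i):
    """Draw the next state given the current state i."""
    return 1 if random.random() < P[i][1] else 0

# Simulate many three-step paths (X_0, X_1, X_2) with X_0 uniform on I.
paths = []
for _ in range(200_000):
    x0 = random.choice([0, 1])
    x1 = step(x0)
    x2 = step(x1)
    paths.append((x0, x1, x2))

# Estimate P{X_2 = 1 | X_0 = 0, X_1 = 1} and P{X_2 = 1 | X_1 = 1};
# by (3.1.1) both should be close to p_11 = 0.4.
long_cond = [p for p in paths if p[0] == 0 and p[1] == 1]
short_cond = [p for p in paths if p[1] == 1]
lhs = sum(p[2] == 1 for p in long_cond) / len(long_cond)
rhs = sum(p[2] == 1 for p in short_cond) / len(short_cond)
print(round(lhs, 2), round(rhs, 2))
```

The two estimates agree (up to sampling noise) because, by construction, the next state is drawn from the current state alone.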

                  In the following, we consider only Markov chains with time-homogeneous tran-
                sition probabilities; that is, we assume that

                                 P{X_{n+1} = j | X_n = i} = p_ij,   i, j ∈ I,

                independently of the time parameter n. The probabilities p_ij are called the one-step
                transition probabilities and satisfy

                             p_ij ≥ 0,  i, j ∈ I,   and   Σ_{j∈I} p_ij = 1,   i ∈ I.
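For a chain with finitely many states, the two conditions on the p_ij say exactly that the matrix (p_ij) is a stochastic matrix: non-negative entries, each row summing to 1. A minimal sketch with a hypothetical three-state matrix (the entries are illustrative, not from the text):

```python
# Hypothetical one-step transition matrix for a chain with I = {0, 1, 2};
# entries chosen for illustration only.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.0, 0.4, 0.6],
]

def is_stochastic(P, tol=1e-12):
    """Check p_ij >= 0 for all i, j and sum_j p_ij = 1 for each i."""
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) <= tol
        for row in P
    )

print(is_stochastic(P))  # True
```

The tolerance guards against floating-point rounding in the row sums; for a countably infinite state space the row-sum condition is a convergent series rather than a finite sum.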