
• A fixed number of M identical customers move around in the network.
• Each station is a single-server station with ample waiting room, and at each station service is in order of arrival.
• The service times of the customers at the different visits to the stations are independent of each other, and the service time of a customer at station $j$ has an exponential distribution with mean $1/\mu_j$ for $j = 1, \ldots, K$.
• Upon service completion at station $i$, the served customer moves with probability $p_{ij}$ to station $j$ for $j = 1, \ldots, K$, where $\sum_{j=1}^{K} p_{ij} = 1$ for all $i = 1, \ldots, K$.

The routing matrix $P = (p_{ij})$, $i, j = 1, \ldots, K$, is assumed to be an irreducible Markov matrix. Since the Markov matrix $P$ is irreducible, its equilibrium distribution $\{\pi_j\}$ is the unique positive solution to the equilibrium equations

\[
\pi_j = \sum_{i=1}^{K} \pi_i p_{ij}, \qquad j = 1, \ldots, K \tag{5.6.11}
\]
in conjunction with the normalizing equation $\sum_{j=1}^{K} \pi_j = 1$. The relative visit frequencies to the stations are proportional to these equilibrium probabilities. To see this, let

$\lambda_j$ = the long-run average arrival rate of customers at station $j$.
Since $\lambda_i$ is also the rate at which customers depart from station $i$, we have that $\lambda_i p_{ij}$ is the rate at which customers arrive at station $j$ from station $i$. This gives the traffic equations
\[
\lambda_j = \sum_{i=1}^{K} \lambda_i p_{ij}, \qquad j = 1, \ldots, K. \tag{5.6.12}
\]
The solution of the equilibrium equations (5.6.11) of the Markov matrix $P$ is unique up to a multiplicative constant, and the traffic equations (5.6.12) are of exactly the same form. Hence, for some constant $\gamma > 0$,
\[
\lambda_j = \gamma \pi_j, \qquad j = 1, \ldots, K. \tag{5.6.13}
\]
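As a small numerical illustration (the routing matrix below is hypothetical, not data from the text), the following Python sketch solves the equilibrium equations (5.6.11) together with the normalizing equation and checks that any multiple $\lambda = \gamma \pi$ satisfies the traffic equations (5.6.12).

```python
import numpy as np

# Hypothetical routing matrix (an irreducible Markov matrix), K = 3 stations
P = np.array([[0.0, 0.7, 0.3],
              [0.4, 0.0, 0.6],
              [0.5, 0.5, 0.0]])
K = P.shape[0]

# Solve pi = pi P together with sum(pi) = 1, i.e. equations (5.6.11):
# append the normalization row to the (transposed) balance equations.
A = np.vstack([P.T - np.eye(K), np.ones(K)])
b = np.append(np.zeros(K), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("pi =", pi)

# Any multiple lambda = gamma * pi then satisfies the traffic equations (5.6.12).
gamma = 4.2                      # an arbitrary positive constant
lam = gamma * pi
print("traffic equations hold:", np.allclose(lam, lam @ P))
```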
Denote by $X_j(t)$ the number of customers present at station $j$ at time $t$. The process $\{(X_1(t), \ldots, X_K(t))\}$ is a continuous-time Markov chain with the finite state space
\[
I = \left\{ (n_1, \ldots, n_K) \;\Big|\; n_i \geq 0, \ \sum_{i=1}^{K} n_i = M \right\}.
\]
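For illustration, here is a minimal simulation sketch of this continuous-time Markov chain (in Python, with hypothetical values for $K$, $M$, the service rates $\mu_j$ and the routing matrix $P$); it estimates the long-run time-average number of customers present at each station.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical data (not from the text): K = 3 stations, M = 5 customers
K, M = 3, 5
mu = np.array([2.0, 1.0, 1.5])               # service rates mu_j
P = np.array([[0.0, 0.7, 0.3],               # irreducible routing matrix
              [0.4, 0.0, 0.6],
              [0.5, 0.5, 0.0]])

def simulate(T=5.0e4):
    """Simulate {(X_1(t), ..., X_K(t))} up to time T."""
    n = np.zeros(K, dtype=int)
    n[0] = M                                 # start with all M customers at station 1
    t = 0.0
    area = np.zeros(K)                       # time integrals of the queue lengths
    while t < T:
        busy = n > 0
        total_rate = mu[busy].sum()          # each non-empty station serves at rate mu_i
        dt = rng.exponential(1.0 / total_rate)
        area += n * dt
        t += dt
        # the station whose service completes first is chosen proportionally to mu_i
        i = rng.choice(K, p=np.where(busy, mu, 0.0) / total_rate)
        j = rng.choice(K, p=P[i])            # route the served customer to station j
        n[i] -= 1
        n[j] += 1
    return area / t

print("time-average number of customers per station:", simulate())
```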
Theorem 5.6.2 The equilibrium distribution of the continuous-time Markov chain $\{X(t) = (X_1(t), \ldots, X_K(t))\}$ is given by
\[
p(n_1, \ldots, n_K) = C \prod_{k=1}^{K} \left( \frac{\pi_k}{\mu_k} \right)^{n_k} \tag{5.6.14}
\]
for some constant $C > 0$.
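As a sketch of how (5.6.14) can be evaluated in practice (again with hypothetical data), one can enumerate the finite state space $I$, compute the unnormalized products $\prod_k (\pi_k/\mu_k)^{n_k}$, and normalize them to obtain the constant $C$; as a consistency check, the result is compared below with the equilibrium distribution obtained by solving the global balance equations of the chain numerically.

```python
import itertools
import numpy as np

# Hypothetical data (not from the text): K = 3 stations, M = 4 customers
K, M = 3, 4
mu = np.array([2.0, 1.0, 1.5])
P = np.array([[0.0, 0.7, 0.3],
              [0.4, 0.0, 0.6],
              [0.5, 0.5, 0.0]])

# Equilibrium distribution {pi_j} of the routing matrix, equations (5.6.11)
A = np.vstack([P.T - np.eye(K), np.ones(K)])
pi, *_ = np.linalg.lstsq(A, np.append(np.zeros(K), 1.0), rcond=None)

# Enumerate the state space I = {n : n_i >= 0, sum_i n_i = M}
states = [n for n in itertools.product(range(M + 1), repeat=K) if sum(n) == M]
index = {n: s for s, n in enumerate(states)}

# Product-form probabilities (5.6.14); the constant C equals 1 / w.sum()
w = np.array([np.prod((pi / mu) ** np.array(n)) for n in states])
p_product = w / w.sum()

# Consistency check: solve the global balance equations of the chain directly
Q = np.zeros((len(states), len(states)))
for s, n in enumerate(states):
    for i in range(K):
        if n[i] == 0:
            continue
        for j in range(K):
            if j == i:
                continue
            m = list(n); m[i] -= 1; m[j] += 1      # service completion at i, routed to j
            rate = mu[i] * P[i, j]
            Q[s, index[tuple(m)]] += rate
            Q[s, s] -= rate
B = np.vstack([Q.T, np.ones(len(states))])
p_direct, *_ = np.linalg.lstsq(B, np.append(np.zeros(len(states)), 1.0), rcond=None)

print("product form agrees with direct solution:", np.allclose(p_product, p_direct))
```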