(a) Verify that

F(z) = µ f_0 (1 − z) / [µ(1 − z) − λz(1 − β(z))].

(b) Use Theorem C.1 in Appendix C to prove that f_j ∼ γ η^j as j → ∞ for some constant γ, where η is the reciprocal of the smallest root of µ(1 − x) − λx(1 − β(x)) = 0 on (1, R).
(c) Verify that 1 − W_q(x) ∼ γ_1 e^{−µ(1−η)x} as x → ∞ for some constant γ_1 > 0.
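The decay rate η in parts (b) and (c) is easily found numerically once β(z) is specified. The following is a minimal sketch, not part of the exercise; it assumes, purely for illustration, geometric weights β_j = (1 − p)p^{j−1}, so that β(z) = (1 − p)z/(1 − pz) and R = 1/p, and the parameter values lam, mu, p are hypothetical.

```python
# Minimal numerical sketch for parts (b)-(c); not part of the exercise.
# Assumption (illustration only): beta_j = (1 - p) * p**(j-1), so that
# beta(z) = (1 - p) z / (1 - p z) and the convergence radius is R = 1/p.
from scipy.optimize import brentq

lam, mu, p = 0.8, 2.0, 0.4      # illustrative arrival rate, service rate, geometric parameter
R = 1.0 / p                     # convergence radius of beta(z)

def beta(z):
    return (1.0 - p) * z / (1.0 - p * z)

def g(x):
    # Left-hand side of  mu(1 - x) - lam x (1 - beta(x)) = 0.
    return mu * (1.0 - x) - lam * x * (1.0 - beta(x))

# x = 1 is always a root.  Since beta is a power series with nonnegative coefficients,
# g is convex, so under rho < 1 there is exactly one further root in (1, R); it can be
# bracketed because g < 0 just above 1 while g -> +infinity as x -> R.
x_star = brentq(g, 1.0 + 1e-9, R - 1e-9)
eta = 1.0 / x_star

print(f"root = {x_star:.6f}, eta = {eta:.6f}")
print(f"so f_j ~ gamma * eta**j and 1 - Wq(x) ~ gamma_1 * exp(-{mu * (1.0 - eta):.6f} * x)")
```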
5.27 Consider the so-called MAP/G/1 queue with a Markov modulated Poisson arrival process (an important application of this model in teletraffic analysis is the buffering of independent on-off sources at a statistical multiplexer). The arrival rate of customers is governed by an exogenous phase process. The phase process is a continuous-time Markov chain with finitely many states s = 0, 1, . . . , m and infinitesimal transition rates α_{st}. It is assumed that the phase process is irreducible and thus has a unique equilibrium distribution, which is denoted by {e_s}. If the phase process is in state s, customers arrive according to a Poisson process with rate λ_s. The service times of the customers are independent random variables which are also independent of the arrival process. Customers are served in order of arrival. It is assumed that the service time of a customer has the same probability distribution function (5.5.3) as in Example 5.5.1. Letting

ρ = (∑_{s=0}^{m} λ_s e_s) µ^{−1} ∑_{j=1}^{∞} j β_j,

it is assumed that the server utilization ρ is less than 1. Also it is assumed that the convergence radius R of the power series β(z) = ∑_{j=1}^{∞} β_j z^j is larger than 1.
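As a small illustration of the quantities just defined, the sketch below (not part of the exercise) computes the equilibrium distribution {e_s} of the phase process and the utilization ρ for a hypothetical two-phase example; the rates α_{st}, λ_s, µ and the geometric choice β_j = (1 − p)p^{j−1} are assumptions made only for this illustration.

```python
# Illustrative sketch: {e_s} and rho for a hypothetical two-phase example.
import numpy as np

alpha = np.array([[0.0, 0.3],
                  [0.7, 0.0]])            # hypothetical transition rates alpha_st
lam = np.array([0.5, 1.5])                # hypothetical arrival rates lambda_0, lambda_1
mu, p = 2.0, 0.4                          # hypothetical service rate and geometric parameter

# Generator Q with q_ss = -sum_{t != s} alpha_st; equilibrium distribution solves
# e Q = 0 together with sum_s e_s = 1.
Q = alpha - np.diag(alpha.sum(axis=1))
n = Q.shape[0]
A = np.vstack([Q.T, np.ones(n)])
e = np.linalg.lstsq(A, np.r_[np.zeros(n), 1.0], rcond=None)[0]

# For the assumed geometric weights, sum_j j beta_j = 1/(1 - p), so the mean service
# time equals 1 / (mu (1 - p)).
mean_service = 1.0 / (mu * (1.0 - p))
rho = (lam @ e) * mean_service
print(f"e = {e}, rho = {rho:.4f}")        # here e = [0.7, 0.3] and rho = 2/3
```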
(a) Let p(i, s) denote the joint equilibrium probability that i customers are present and the arrival process is in phase s. Verify that for any s there is a constant γ_s such that

p(i, s) ∼ γ_s η^i as i → ∞,

where η is the reciprocal of the smallest root τ of det A(x) = 0 on (1, R). Here the (m + 1) × (m + 1) matrix A(z) is given by

A(z) = µ(1 − z)I − z(1 − β(z))Λ + zQ^T,

where Λ is the diagonal matrix Λ = diag(λ_0, λ_1, . . . , λ_m) and Q^T is the transpose of the transition matrix Q = (q_{st}), s, t = 0, 1, . . . , m, with q_{st} = α_{st} for t ≠ s and q_{ss} = −∑_{t≠s} α_{st}. For the special case of m = 1 with α_{01} = ω_1 and α_{10} = ω_2 (switched Poisson process), verify that the determination of τ reduces to finding the smallest root of

[(λ_0 + µ + ω_1)z − λ_0 zβ(z) − µ][(λ_1 + µ + ω_2)z − λ_1 zβ(z) − µ] − ω_1 ω_2 z^2 = 0

on the interval (1, R). Conclude that the geometric tail approach can be applied to calculate the state probabilities p(i, s).
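A minimal sketch of the root search for τ in the switched Poisson case m = 1 is given below; it is not part of the exercise. It again assumes geometric weights β_j = (1 − p)p^{j−1} (so R = 1/p) and uses hypothetical rates λ_0, λ_1, ω_1, ω_2, µ. Since det A(x) vanishes at x = 1 and may cross zero more than once on (1, R), a coarse grid scan brackets the smallest root before it is refined.

```python
# Sketch (assumptions only): smallest root tau of det A(x) = 0 on (1, R) for m = 1.
import numpy as np
from scipy.optimize import brentq

lam = np.array([0.5, 1.5])                # hypothetical arrival rates lambda_0, lambda_1
mu, p = 2.0, 0.4                          # hypothetical service rate and geometric parameter
omega1, omega2 = 0.3, 0.7                 # switching rates alpha_01 and alpha_10
R = 1.0 / p

def beta(z):
    return (1.0 - p) * z / (1.0 - p * z)

def det_A(x):
    Lam = np.diag(lam)                                    # diagonal matrix Lambda
    Q = np.array([[-omega1, omega1],
                  [omega2, -omega2]])                     # generator of the phase process
    A = mu * (1.0 - x) * np.eye(2) - x * (1.0 - beta(x)) * Lam + x * Q.T
    return np.linalg.det(A)

def smallest_root(f, a, b, n=4000):
    """Scan a grid on (a, b) for the first sign change of f and refine it with brentq."""
    xs = np.linspace(a, b, n)
    vals = [f(x) for x in xs]
    for x0, x1, v0, v1 in zip(xs, xs[1:], vals, vals[1:]):
        if v0 == 0.0:
            return x0
        if v0 * v1 < 0.0:
            return brentq(f, x0, x1)
    raise ValueError("no sign change found on (a, b)")

tau = smallest_root(det_A, 1.0 + 1e-6, R - 1e-6)
eta = 1.0 / tau
print(f"tau = {tau:.6f}, eta = {eta:.6f}   (so p(i, s) ~ gamma_s * eta**i as i -> infinity)")
```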
(b) Let π_j denote the long-run fraction of customers who find j other customers present upon arrival. Argue that

π_j = ∑_{s=0}^{m} λ_s p(j, s) / ∑_{s=0}^{m} λ_s e_s.

(c) Let W_q(x) denote the limiting probability distribution function of the delay in queue of a customer. Verify that 1 − W_q(x) ∼ γ e^{−µ(1−η)x} as x → ∞ for some constant γ.
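To illustrate part (b): once the probabilities p(i, s) have been computed, for example by the geometric tail approach of part (a), the customer-average distribution {π_j} follows from the arrival-rate weighting above. The sketch below uses a placeholder array of the purely geometric form p(i, s) = (1 − η)η^i e_s; the values of η, λ_s, e_s and µ are hypothetical, not solutions of the model.

```python
# Sketch for parts (b)-(c); the array p(i, s) below is a placeholder, not a computed solution.
import numpy as np

lam = np.array([0.5, 1.5])                  # hypothetical arrival rates lambda_0, lambda_1
e = np.array([0.7, 0.3])                    # equilibrium distribution of the phase process
mu, eta = 2.0, 0.82                         # hypothetical service rate and decay rate from (a)

i = np.arange(200)
p = np.outer((1.0 - eta) * eta**i, e)       # placeholder p(i, s) with a purely geometric shape

# pi_j = sum_s lambda_s p(j, s) / sum_s lambda_s e_s : customers arriving while the phase
# process is in state s do so at rate lambda_s, hence the weighting by the arrival rates.
pi = (p @ lam) / (lam @ e)
print(pi[:5], pi.sum())                     # the pi_j sum to (essentially) 1

# Part (c): the delay in queue has an exponential tail with decay rate mu * (1 - eta).
print(f"1 - Wq(x) ~ gamma * exp(-{mu * (1.0 - eta):.4f} * x) for large x")
```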
BIBLIOGRAPHIC NOTES
Queueing problems have laid the foundation for the continuous-time Markov chain
model. The Erlang delay model and the Erlang loss model stem from teletraffic
analysis. The square-root rule is discussed in many papers and was obtained by
A.K. Erlang in an unpublished paper in 1924. Recommended references are Borst