time between the arrival epochs of the $n$th and $(n+1)$th customer and let $C_n$
denote the number of customers served during the interarrival time $A_n$. Note that
$X_{n+1} = X_n + 1 - C_n$. The probability distribution of $C_n$ obviously depends on
$X_n$ (= the number of customers seen by the $n$th arrival). The easiest way to find
the probability distribution of $C_n$ is to use the observation that service completions
occur according to a Poisson process with rate $\mu$ as long as the server is busy.
This observation is a consequence of the assumption of exponentially distributed
service times and the relation between the Poisson process and the exponential
distribution. By conditioning on the interarrival time $A_n$ and using the law of total
probability, we find for each state $i$ that
\begin{align*}
p_{ij} &= P\{X_{n+1} = j \mid X_n = i\} \\
&= \int_0^\infty P\{i+1-j \text{ service completions during } A_n \mid A_n = t\}\, a(t)\, dt \\
&= \int_0^\infty e^{-\mu t}\, \frac{(\mu t)^{i+1-j}}{(i+1-j)!}\, a(t)\, dt, \qquad 1 \le j \le i+1. \tag{3.1.2}
\end{align*}
This formula does not hold for $j = 0$. Why not? The probability $p_{i0}$ is easiest to
compute from
\[
p_{i0} = 1 - \sum_{j=1}^{i+1} p_{ij}, \qquad i = 0, 1, \dots .
\]
Obviously, $p_{ij} = 0$ for $j > i + 1$ for each state $i$.
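To make (3.1.2) and the complement formula for $p_{i0}$ concrete, the following sketch evaluates the one-step transition probabilities numerically. It is only an illustration under assumed inputs, not part of the text: the Erlang-2 interarrival density $a(t)$, the rates \texttt{lam} and \texttt{mu}, and the truncation level \texttt{N} are all hypothetical choices for the example.

\begin{verbatim}
import math
import numpy as np
from scipy.integrate import quad

# Illustrative parameters (assumed, not from the text).
lam, mu = 1.0, 1.5    # arrival rate and service rate
N = 10                # truncate the state space at N customers

def a(t):
    # Erlang-2 interarrival density with mean 1/lam (an assumed example of a(t)).
    r = 2.0 * lam
    return r * r * t * math.exp(-r * t)

def p(i, j):
    # One-step transition probability p_ij of the embedded chain, cf. (3.1.2).
    if j > i + 1:
        return 0.0
    if j >= 1:
        k = i + 1 - j     # number of service completions during A_n
        f = lambda t: math.exp(-mu * t) * (mu * t)**k / math.factorial(k) * a(t)
        return quad(f, 0.0, np.inf)[0]
    # j = 0: complement formula, since at most i + 1 customers can be served.
    return 1.0 - sum(p(i, jj) for jj in range(1, i + 2))

# Truncated transition matrix; from state N the chain can still jump to N + 1,
# which lies outside the truncation, so the last row sums to slightly less than one.
P = np.array([[p(i, j) for j in range(N + 1)] for i in range(N + 1)])
\end{verbatim}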
3.2 TRANSIENT ANALYSIS
This section deals with the transient analysis of the Markov chain $\{X_n, n =
0, 1, \dots\}$ with state space $I$ and one-step transition probabilities $p_{ij}$ for $i, j \in I$.
We first show how the one-step transition probabilities determine the probability of
going from state $i$ to state $j$ in the next $n$ steps. The $n$-step transition probabilities
are defined by
\[
p_{ij}^{(n)} = P\{X_n = j \mid X_0 = i\}, \qquad i, j \in I
\]
for any $n = 1, 2, \dots$. Note that $p_{ij}^{(1)} = p_{ij}$. It is convenient to define
\[
p_{ij}^{(0)} =
\begin{cases}
1 & \text{if } j = i, \\
0 & \text{if } j \neq i.
\end{cases}
\]
Theorem 3.2.1 (Chapman–Kolmogoroff equations) For all $n, m = 0, 1, \dots$,
\[
p_{ij}^{(n+m)} = \sum_{k \in I} p_{ik}^{(n)}\, p_{kj}^{(m)}, \qquad i, j \in I. \tag{3.2.1}
\]
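A short numerical illustration of Theorem 3.2.1: the $n$-step transition matrix can be built by repeated matrix multiplication, and any split of $n + m$ steps gives the same result. The three-state matrix \texttt{P} below is an assumed example, not taken from the text.

\begin{verbatim}
import numpy as np

# Assumed three-state example; each row of P sums to one.
P = np.array([[0.2, 0.5, 0.3],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

def n_step(P, n):
    # Apply (3.2.1) with m = 1 repeatedly: P^(n) = P^(n-1) P, and P^(0) = I.
    Pn = np.eye(P.shape[0])
    for _ in range(n):
        Pn = Pn @ P
    return Pn

# Chapman-Kolmogoroff check: P^(5) = P^(2) P^(3).
assert np.allclose(n_step(P, 5), n_step(P, 2) @ n_step(P, 3))
\end{verbatim}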