Page 94 - A First Course In Stochastic Models
86 DISCRETE-TIME MARKOV CHAINS
p_{i0} = P{the demand in the coming week is S or more}

       = \sum_{k=S}^{\infty} e^{-\lambda} \frac{\lambda^k}{k!}
       = 1 - \sum_{k=0}^{S-1} e^{-\lambda} \frac{\lambda^k}{k!}.
The following example illustrates the powerful technique of embedded Markov
chains. Many stochastic processes can be analysed by using properly chosen embed-
ded stochastic processes that are discrete-time Markov chains. A classic example
is the single-server M/G/1 queue with Poisson arrivals and general service times.
The embedded process describing the number of customers left behind at the ser-
vice completion epochs is a discrete-time Markov chain; see also Section 2.5.
Another example is provided by the ‘dual’ queue with general interarrival times
and exponential service times.
Example 3.1.3 The GI/M/1 queue
Customers arrive at a single-server station according to a renewal process, that is,
the interarrival times of the customers are independent and identically distributed
random variables. It is assumed that the interarrival time has a probability den-
sity a(t). A customer who finds upon arrival that the server is idle enters service
immediately; otherwise the customer waits in line. The service times of the suc-
cessive customers are independent random variables having a common exponential
distribution with mean 1/µ. The service times are also independent of the arrival
process. A customer leaves the system upon service completion. This queueing
system is usually abbreviated as the GI/M/1 queue. For any t ≥ 0, define the
random variable X(t) by
X(t) = the number of customers present at time t.
The continuous-time stochastic process {X(t), t ≥ 0} does not possess the Marko-
vian property that the future behaviour of the process depends only on its present
state. Clearly, to predict the future behaviour of the process, knowledge of the
number of customers present does not suffice in general; the time elapsed since
the last arrival must be known as well. Note that, by the memoryless
property of the exponential distribution, the elapsed service time of the service
in progress (if any) is not relevant. However, we can find an embedded Markov
chain for the continuous-time process {X(t)}. Consider the process embedded at
the epochs when customers arrive. At these epochs the time elapsed since the last
arrival is known and equals zero. Define for n = 0, 1, . . . ,
X_n = the number of customers present just prior to the nth arrival epoch
with X_0 = 0 by convention. The embedded stochastic process {X_n, n = 0, 1, ...}
is a discrete-time Markov chain, since the exponential services are memoryless.
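The one-step dynamics of this embedded chain can be sketched as a simulation: just after the nth arrival there are X_n + 1 customers, and during the next interarrival time the single server completes independent exponential(µ) services until either the interval ends or the system empties. A minimal Python sketch under these assumptions (the Erlang-2 interarrival distribution in the example is an arbitrary illustrative choice):

```python
import random

def simulate_embedded_chain(n_arrivals, interarrival_sampler, mu, seed=42):
    """Simulate X_n = number of customers present just prior to the
    nth arrival epoch in a GI/M/1 queue with service rate mu."""
    rng = random.Random(seed)
    x = 0            # X_0 = 0 by convention
    states = [x]
    for _ in range(n_arrivals - 1):
        present = x + 1                  # customers just after this arrival
        a = interarrival_sampler(rng)    # time until the next arrival
        # The server works through the queue; each (remaining) service
        # time is Exp(mu) by the memoryless property.
        t, departures = 0.0, 0
        while departures < present:
            t += rng.expovariate(mu)
            if t > a:
                break
            departures += 1
        x = present - departures
        states.append(x)
    return states

# Example: Erlang-2 interarrival times with mean 1, service rate mu = 1.25
states = simulate_embedded_chain(
    10_000,
    lambda rng: rng.expovariate(2.0) + rng.expovariate(2.0),
    mu=1.25,
)
```

Because the service times are memoryless, the pair (X_n, fresh interarrival clock) is all the simulation needs to carry from one arrival epoch to the next, which is exactly why the embedded process is a Markov chain.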
This Markov chain has the countably infinite state space I = {0, 1, ...}. To find
the one-step transition probabilities p_ij of the Markov chain, denote by A_n the