A First Course in Stochastic Models, p. 134
DISCRETE-TIME MARKOV CHAINS
EXERCISES
3.1 A production machine has two crucial parts which are subject to failures. The two parts
are identical. The machine works as long as one of the two parts is functioning. A repair is
done when both parts have failed. A repair takes one day and after each repair the system is
as good as new. An inspection at the beginning of each day reveals the exact condition of
each part. If at the beginning of a day both parts are in good condition, then at the end of the
day both parts are still in good condition with probability 0.50, one of them is broken down
with probability 0.25 and both are broken down with probability 0.25. If at the beginning
of the day only one part is in good condition, this part is still in good condition at the end
of the day with probability 0.50. Define a Markov chain to describe the functioning of the
machine and specify the one-step transition probabilities.
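A quick sketch of one possible answer (the state definition is an assumption, since the exercise leaves the choice open): take the state to be the number of parts in good condition at the start of the day, with state 0 meaning the machine spends that day in repair.

```python
import numpy as np

# Assumed state definition: X_n = number of parts in good condition at the
# start of day n, with state 0 meaning the machine is under its one-day
# repair.  Rows/columns are ordered as states 2, 1, 0.
P = np.array([
    [0.50, 0.25, 0.25],   # from 2 good parts
    [0.00, 0.50, 0.50],   # from 1 good part
    [1.00, 0.00, 0.00],   # repair finishes: system as good as new
])
assert np.allclose(P.sum(axis=1), 1.0)   # each row is a distribution
```

The third row encodes the "as good as new" assumption: from the repair state the chain moves to the two-good-parts state with certainty.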
3.2 To improve the reliability of a production system, two identical production machines are
connected in parallel. For the production process only one of the machines is used; the other
machine is standby. At the end of the day the used machine is inspected. Regardless of how
long the machine has already been in uninterrupted use, the probability that an inspection
reveals the necessity for revision is 1/10. A revision takes exactly two days. During the revision
the other machine takes over the production if that machine is available. The production
process must be stopped when both machines are in revision. Assuming that there are two
repairmen, define an appropriate Markov chain to describe the functioning of the production
system and specify the one-step transition probabilities of the Markov chain.
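Before writing down the chain it can help to simulate the system. The sketch below (a sanity check, not the exercise's answer) tracks the remaining revision days of each machine and estimates the long-run fraction of days on which production is stopped; the per-machine revision counter is exactly the kind of state variable the Markov chain needs.

```python
import random

def stopped_fraction(days, p_revision=0.1, seed=1):
    """Simulate the two-machine system with two repairmen and return the
    fraction of days on which production is stopped (both in revision)."""
    rng = random.Random(seed)
    rev = [0, 0]                       # remaining revision days per machine
    stopped = 0
    for _ in range(days):
        available = [i for i in (0, 1) if rev[i] == 0]
        used = available[0] if available else None
        if used is None:
            stopped += 1               # both machines in revision
        # end of day: both repairmen work in parallel, so every revision
        # in progress advances by one day
        rev = [max(r - 1, 0) for r in rev]
        # end-of-day inspection of the machine that was in use
        if used is not None and rng.random() < p_revision:
            rev[used] = 2              # machine unavailable the next two days
    return stopped / days
```

With two repairmen the revisions never queue, so the pair of remaining-revision-day counters is a legitimate Markov state.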
3.3 Containers are temporarily stored at a stockyard with ample capacity. At the beginning
of each day precisely one container arrives at the stockyard. Each container stays a certain
amount of time at the stockyard before it is removed. The residency times of the contain-
ers are independent of each other. Specify for each of the following two cases the state
variable(s) and the one-step transition probabilities of a Markov chain that can be used to
analyse the number of containers present at the stockyard at the end of each day.
(a) The residency time of a container is exponentially distributed with a mean of 1/µ
days.
(b) The residency time of a container has an exponential distribution whose mean is 1/µ₁
days with probability p and is 1/µ₂ days with probability 1 − p.
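For part (a) the memoryless property does the work: each container present survives one further day with probability e^{−µ}, independently of its age. If the state is the number of containers present at the end of the day, then next morning's arrival gives i + 1 candidates and the transition probabilities are binomial. A sketch under that (assumed) state definition:

```python
import math

def transition_prob(i, j, mu):
    # P(X_{n+1} = j | X_n = i) for part (a): one container arrives next
    # morning, then each of the i + 1 containers present survives the day
    # with probability s = exp(-mu), independently, by memorylessness.
    s = math.exp(-mu)
    if j > i + 1:
        return 0.0
    return math.comb(i + 1, j) * s**j * (1 - s)**(i + 1 - j)

# sanity check: each row of the transition matrix sums to one
mu = 0.5
assert abs(sum(transition_prob(3, j, mu) for j in range(5)) - 1.0) < 1e-12
```

For part (b) a single count no longer suffices: the state must record how many type-1 and how many type-2 containers are present, since the two types have different one-day survival probabilities.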
3.4 Two teams, A and B, meet each other in a series of games until either of the teams has
won three games in a row. Each game results in a win for either of the teams (no draw is
possible). The outcomes of the games are independent of each other. Define an appropriate
Markov chain to determine the probability distribution of the length of the match when the
two teams are equally strong.
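Since the teams are equally strong, symmetry lets the state be just the length of the current winning run. A minimal sketch of the resulting recursion for the match-length distribution (the state definition is one natural choice, not the only one):

```python
def length_distribution(max_games=100):
    # Equally strong teams: by symmetry the state after game n is just the
    # length r of the current winning run (r = 1 or 2); the match ends when
    # r reaches 3.  Returns a dict {n: P(match lasts exactly n games)}.
    probs = {1: 1.0, 2: 0.0}           # after game 1 the run has length 1
    dist = {}
    for n in range(2, max_games + 1):
        dist[n] = probs[2] * 0.5       # the leader wins a third game in a row
        probs = {1: (probs[1] + probs[2]) * 0.5,   # run broken, restarts at 1
                 2: probs[1] * 0.5}                # run extended to length 2
    return dist

d = length_distribution()
assert abs(d[3] - 0.25) < 1e-12   # ends at game 3 iff games 2, 3 repeat game 1
```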
3.5 Consider Exercise 3.4 again, but assume now that team A wins a given game with a
probability larger than 1/2.
(a) Use Markov chain analysis to determine the probability distribution of the length of
the match. Explain how to calculate the probability that team A wins the match.
(b) Explain how to modify the Markov chain analysis when a draw between the teams is
possible with positive probability.
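When the teams are not equally strong, the symmetry used in Exercise 3.4 breaks down and the state must also record who won the last game. A sketch of part (a)'s win-probability calculation under that (assumed) state space, iterating the chain until the transient mass is negligible:

```python
def prob_A_wins(p, tol=1e-12):
    # States (w, r): team w in {'A', 'B'} won the last game and the current
    # winning run has length r in {1, 2}.  A run reaching 3 ends the match.
    probs = {('A', 1): p, ('A', 2): 0.0,
             ('B', 1): 1 - p, ('B', 2): 0.0}
    a_wins = 0.0
    while sum(probs.values()) > tol:
        new = {s: 0.0 for s in probs}
        for (w, r), q in probs.items():
            if q == 0.0:
                continue
            pw = p if w == 'A' else 1 - p      # prob. the same team wins again
            other = 'B' if w == 'A' else 'A'
            if r == 2:
                if w == 'A':                    # run of three completed by A
                    a_wins += q * pw
            else:
                new[(w, 2)] += q * pw           # run extended to length 2
            new[(other, 1)] += q * (1 - pw)     # run broken; other team leads
        probs = new
    return a_wins

assert abs(prob_A_wins(0.5) - 0.5) < 1e-9       # symmetric case as a check
```

For part (b) one extra transition per state suffices: with the draw probability the run length is left unchanged (or reset, depending on the convention adopted for draws).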
3.6 You play the following game. A fair coin is flipped until heads appears three times in a
row. You get $12 each time this happens, but you have to pay $1 for each flip of the coin.
Use Markov chain analysis to find out whether this game is fair.
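The one-step analysis here leads to a small linear system. A sketch, assuming the state is the length of the current run of heads: with E_i the expected number of further flips starting from a run of i heads, E_i = 1 + (1/2)E_{i+1} + (1/2)E_0 with E_3 = 0.

```python
import numpy as np

# Linear system for E_0, E_1, E_2 (expected further flips to reach HHH
# given the current run of heads), rearranged from
#   E_i = 1 + (1/2) E_{i+1} + (1/2) E_0,   E_3 = 0.
A = np.array([
    [0.5, -0.5,  0.0],    # E_0 - (1/2)E_1 - (1/2)E_0 = 1
    [-0.5, 1.0, -0.5],    # E_1 - (1/2)E_2 - (1/2)E_0 = 1
    [-0.5, 0.0,  1.0],    # E_2 - (1/2)E_0           = 1
])
b = np.ones(3)
E = np.linalg.solve(A, b)
assert abs(E[0] - 14.0) < 1e-9
# Each completed run costs $14 in flips on average against a $12 reward,
# so the game is unfavourable by $2 per run: it is not fair.
```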
3.7 Consider the following variant of the coupon-collecting problem. A fair die is thrown
until each of the six possible outcomes 1, 2, . . . , 6 has appeared. Use a Markov chain with
seven states to calculate the probability distribution of the number of throws needed.
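The seven states are the number of distinct faces seen so far, 0 through 6, and the distribution of the number of throws follows from powers of the transition matrix. A sketch, checked against the standard coupon-collector mean 6(1 + 1/2 + ... + 1/6) = 14.7:

```python
import numpy as np
from math import isclose

# State i = number of distinct faces seen so far; from state i the next
# throw shows a new face with probability (6 - i)/6.  State 6 is absorbing.
P = np.zeros((7, 7))
for i in range(6):
    P[i, i] = i / 6
    P[i, i + 1] = (6 - i) / 6
P[6, 6] = 1.0

def cdf(n):
    # P(N <= n): probability state 6 is reached within n throws
    return np.linalg.matrix_power(P, n)[0, 6]

# sanity check of the resulting distribution against the known mean
expected = sum((cdf(n) - cdf(n - 1)) * n for n in range(1, 400))
assert isclose(expected, 14.7, abs_tol=1e-6)
```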
3.8 The gambler Joe Dalton has $100 and his goal is to double this amount. Therefore he
plays a gambling game in which he loses his stake with probability 0.60, but wins two or