given by
\[
P = \begin{pmatrix}
0.70 & 0.10 & 0.20 \\
0.50 & 0.25 & 0.25 \\
0.40 & 0.30 & 0.30
\end{pmatrix}.
\]
To obtain the probability of having sunny weather three days from now, we need
the matrix product $P^3$:
\[
P^3 = \begin{pmatrix}
0.6015000 & 0.1682500 & 0.2302500 \\
0.5912500 & 0.1756250 & 0.2331250 \\
0.5855000 & 0.1797500 & 0.2347500
\end{pmatrix}.
\]
This matrix shows that it will be sunny three days from now with probability 0.5855
when the present day is rainy. You could also ask: what is the probability distri-
bution of the weather after many days? Intuitively you expect that this probability
distribution does not depend on the present weather. This is indeed confirmed by
the calculations:
\[
P^5 = \begin{pmatrix}
0.5963113 & 0.1719806 & 0.2317081 \\
0.5957781 & 0.1723641 & 0.2318578 \\
0.5954788 & 0.1725794 & 0.2319418
\end{pmatrix},
\]
\[
P^{12} = \begin{pmatrix}
0.5960265 & 0.1721854 & 0.2317881 \\
0.5960265 & 0.1721854 & 0.2317881 \\
0.5960265 & 0.1721854 & 0.2317881
\end{pmatrix} = P^{13} = P^{14} = \cdots.
\]
In this example the n-step transition probability $p_{ij}^{(n)}$ converges for $n \to \infty$ to a
limit which is independent of the initial state $i$. You see that the weather after many
days will be sunny, cloudy or rainy with respective probabilities 0.5960, 0.1722 and
0.2318. Intuitively it will be clear that these probabilities also give the proportions
of time the weather is sunny, cloudy and rainy over a long period. The limiting
behaviour of the n-step transition probabilities is the subject of Section 3.3.
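As a check on these numbers, the matrix powers above can be reproduced in a few lines of code. The sketch below uses NumPy; the ordering of the states as (sunny, cloudy, rainy) is an assumption of this illustration, not part of the text.

```python
import numpy as np

# One-step transition matrix; rows and columns ordered as (sunny, cloudy, rainy).
P = np.array([[0.70, 0.10, 0.20],
              [0.50, 0.25, 0.25],
              [0.40, 0.30, 0.30]])

# The n-step transition probabilities are the entries of the matrix power P^n.
for n in (3, 5, 12):
    Pn = np.linalg.matrix_power(P, n)
    print(f"P^{n}:\n{Pn}\n")

# For large n all rows agree: the weather distribution after many days no longer
# depends on today's weather (approximately 0.5960, 0.1722, 0.2318).
```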
3.2.1 Absorbing States
A useful Markov chain model is the model with one or more absorbing states. A
state is absorbing if the process can never leave it once it has entered it.
Definition 3.2.1 A state $i$ is said to be an absorbing state if $p_{ii} = 1$.
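As a small illustration of this definition, an absorbing state can be recognized from the diagonal of the transition matrix. The matrix below is a hypothetical example, not one from the text.

```python
import numpy as np

# Hypothetical transition matrix in which state 2 is absorbing (p_22 = 1).
P_abs = np.array([[0.5, 0.3, 0.2],
                  [0.4, 0.4, 0.2],
                  [0.0, 0.0, 1.0]])

# By Definition 3.2.1, state i is absorbing exactly when p_ii = 1.
absorbing = [i for i in range(P_abs.shape[0]) if np.isclose(P_abs[i, i], 1.0)]
print(absorbing)  # -> [2]
```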
The next example shows the usefulness of the Markov model with absorbing states.
Example 3.2.2 Success runs in roulette
A memorable event occurred in the casino of Monte Carlo on the evening of 18
August 1913. The roulette ball hit a red number 26 times in a row. In European