
10. Molecular Phylogeny
where I is the identity matrix. It is easy to check that the solution of the
initial value problem (10.6) is furnished by the matrix exponential
\[
P(t) = e^{t\Lambda} = \sum_{k=0}^{\infty} \frac{t^k \Lambda^k}{k!}. \tag{10.7}
\]
Probabilists call Λ the infinitesimal generator or infinitesimal transition
matrix of the process.
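The series (10.7) is easy to evaluate numerically. The following minimal Python sketch, assuming a small hypothetical 2-state rate matrix Λ (not taken from the text), sums a truncated version of the series and compares it with SciPy's matrix exponential; the rows of the resulting P(t) sum to one, as a transition matrix should.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical 2-state rate matrix: off-diagonal entries are nonnegative
# rates and each row sums to zero.
Lam = np.array([[-1.0,  1.0],
                [ 2.0, -2.0]])
t = 0.5

# Truncated version of the series in equation (10.7).
P_series = np.zeros_like(Lam)
term = np.eye(2)                       # k = 0 term of the series
for k in range(30):
    P_series += term
    term = term @ (t * Lam) / (k + 1)  # build the (k+1)st term from the kth

# Library matrix exponential for comparison.
P_expm = expm(t * Lam)

print(P_series)
print(P_expm)              # agrees with P_series to machine precision
print(P_expm.sum(axis=1))  # each row of P(t) sums to one
```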
A probability distribution π = (π_i) on the states of a Markov chain is a
row vector whose components satisfy π_i ≥ 0 for all i and $\sum_i \pi_i = 1$. If
\[
\pi P(t) = \pi \tag{10.8}
\]
holds for all t ≥ 0, then π is said to be an equilibrium distribution for
the chain. Written in components, the eigenvector equation (10.8) reduces
to $\sum_i \pi_i p_{ij}(t) = \pi_j$. Again, this is completely analogous to the
discrete-time theory described in Chapter 9. For small t, equation (10.8)
can be rewritten as
\[
\pi(I + t\Lambda) + o(t) = \pi.
\]
This approximate form makes it obvious that πΛ = 0 is a necessary condition
for π to be an equilibrium distribution. Multiplying (10.7) on the left
by π shows that πΛ = 0 is also a sufficient condition: if πΛ = 0, then
πΛ^k = 0 for every k ≥ 1, and the series (10.7) collapses to πP(t) = π for
all t ≥ 0. In components, this necessary and sufficient condition
amounts to


\[
\sum_{j \ne i} \pi_j \lambda_{ji} = \pi_i \sum_{j \ne i} \lambda_{ij} \tag{10.9}
\]
                              for all i. If all the states of a Markov chain communicate, then there is one
                              and only one equilibrium distribution π. Furthermore, each of the rows of
                              P(t) approaches π as t →∞. Lamperti [16] provides a clear exposition of
                              these facts.
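To make the condition πΛ = 0 concrete, the sketch below, assuming a hypothetical 3-state rate matrix in which all off-diagonal rates are positive (so all states communicate and the equilibrium distribution is unique), solves for π by appending the normalization $\sum_i \pi_i = 1$ and then checks the balance relation (10.9) state by state.

```python
import numpy as np

# Hypothetical 3-state rate matrix; all off-diagonal rates are positive,
# so all states communicate and the equilibrium distribution is unique.
Lam = np.array([[-2.0,  1.0,  1.0],
                [ 2.0, -3.0,  1.0],
                [ 1.0,  2.0, -3.0]])
n = Lam.shape[0]

# Solve pi Lam = 0 together with the normalization sum_i pi_i = 1 by
# stacking Lam^T with a row of ones and using least squares.
A = np.vstack([Lam.T, np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)        # equilibrium distribution, here (0.4375, 0.3125, 0.25)
print(pi @ Lam)  # numerically zero

# Balance condition (10.9): probability flow into state i equals flow out.
for i in range(n):
    inflow = sum(pi[j] * Lam[j, i] for j in range(n) if j != i)
    outflow = pi[i] * sum(Lam[i, j] for j in range(n) if j != i)
    print(i, inflow, outflow)  # the two values agree for every i
```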
Fortunately, the annoying feature of periodicity present in discrete-time
theory disappears in the continuous-time theory. The definition and
properties of reversible chains carry over directly from discrete time to
continuous time provided we substitute infinitesimal transition
probabilities for transition probabilities. For instance, the detailed
balance condition becomes


\[
\pi_i \lambda_{ij} = \pi_j \lambda_{ji} \tag{10.10}
\]
for all pairs i ≠ j. Kolmogorov’s circulation criterion for reversibility
continues to hold, and when it is true, the equilibrium distribution is
constructed from the infinitesimal transition probabilities exactly as in
discrete time.
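As an illustration of the detailed balance condition, the following sketch assumes a hypothetical birth-death style rate matrix on three states (only nearest-neighbor moves, so the chain is reversible), builds the equilibrium distribution from ratios of forward to backward rates in the spirit of the discrete-time construction, and verifies (10.10) directly.

```python
import numpy as np

# Hypothetical birth-death style rate matrix on states {0, 1, 2}: only
# nearest-neighbor moves are allowed, so the chain is reversible.
Lam = np.array([[-2.0,  2.0,  0.0],
                [ 1.0, -4.0,  3.0],
                [ 0.0,  2.0, -2.0]])
n = Lam.shape[0]

# Unnormalized equilibrium weights along the path 0 -> 1 -> 2:
# pi_{i+1} / pi_i = lambda_{i,i+1} / lambda_{i+1,i}.  Then normalize.
w = np.ones(n)
for i in range(n - 1):
    w[i + 1] = w[i] * Lam[i, i + 1] / Lam[i + 1, i]
pi = w / w.sum()
print(pi)  # (1/6, 1/3, 1/2) for the rates above

# Detailed balance (10.10): pi_i * lambda_ij = pi_j * lambda_ji for i != j.
for i in range(n):
    for j in range(i + 1, n):
        print(i, j, pi[i] * Lam[i, j], pi[j] * Lam[j, i])  # pairs agree
```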