
9. Descent Graph Methods
   locus there are two alleles A and a. A plant can have any of the three
   genotypes A/A, A/a, or a/a. Define a Markov chain with three states
   giving the genotype of the current plant in the selfing scheme. Show
   that the nth power of the transition matrix is
   \[
   P^n \;=\;
   \begin{pmatrix}
   1 & 0 & 0 \\[2pt]
   \tfrac{1}{2} - \left(\tfrac{1}{2}\right)^{n+1} & \left(\tfrac{1}{2}\right)^{n} & \tfrac{1}{2} - \left(\tfrac{1}{2}\right)^{n+1} \\[2pt]
   0 & 0 & 1
   \end{pmatrix}.
   \]
   What is lim_{n→∞} P^n? Demonstrate that this Markov chain has multiple
   equilibrium distributions and characterize them. (A numerical check of
   the displayed formula is sketched below.)
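
This closed form is easy to verify numerically. A minimal Python sketch, assuming the selfing transition matrix P below (states ordered A/A, A/a, a/a), whose middle row follows from Mendelian segregation under self-fertilization:

```python
import numpy as np

# Assumed selfing transition matrix, states ordered (A/A, A/a, a/a);
# the heterozygote row reflects Mendelian segregation under selfing.
P = np.array([[1.0,  0.0, 0.0],
              [0.25, 0.5, 0.25],
              [0.0,  0.0, 1.0]])

def closed_form(n):
    """Closed form of P^n claimed in the exercise."""
    return np.array([[1.0, 0.0, 0.0],
                     [0.5 - 0.5 ** (n + 1), 0.5 ** n, 0.5 - 0.5 ** (n + 1)],
                     [0.0, 0.0, 1.0]])

for n in (1, 2, 5, 20):
    assert np.allclose(np.linalg.matrix_power(P, n), closed_form(n))

# For large n the middle row tends to (1/2, 0, 1/2); every distribution
# concentrated on the absorbing states A/A and a/a is an equilibrium.
print(closed_form(50).round(6))
```
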
4. Find a transition matrix P such that lim_{n→∞} P^n does not exist.
   (One concrete choice is sketched below.)
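
For instance, the deterministic flip chain on two states works: its odd powers equal P and its even powers equal the identity, so the sequence of powers oscillates. A quick numerical check in Python:

```python
import numpy as np

# Two-state flip chain: P^n alternates between P (n odd) and I (n even),
# so lim_{n -> infinity} P^n does not exist.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

for n in range(1, 5):
    print(n, np.linalg.matrix_power(P, n).tolist())
```
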
5. For an irreducible chain, demonstrate that aperiodicity is a necessary
   and sufficient condition for some power P^n of the transition matrix
   P to have all entries positive. (Hint: For sufficiency, you may use the
   following number-theoretic fact: Suppose S is a set of positive integers
   that is closed under addition and has greatest common divisor 1. Then
   there exists an integer m such that n ∈ S whenever n ≥ m.) A small
   numerical illustration follows this exercise.
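
The equivalence can be probed numerically before proving it. A small Python sketch (an illustration, not a proof) that searches for a strictly positive power of two hypothetical three-state chains, one aperiodic and one of period 3:

```python
import numpy as np

# Irreducible and aperiodic: state 2 returns in 2 steps (2->1->2) and in
# 3 steps (2->0->1->2), so the gcd of its return times is 1.
aperiodic = np.array([[0.0, 1.0, 0.0],
                      [0.0, 0.0, 1.0],
                      [0.5, 0.5, 0.0]])

# Irreducible but periodic: a deterministic 3-cycle, period 3.
periodic = np.array([[0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0],
                     [1.0, 0.0, 0.0]])

def first_positive_power(P, max_n=50):
    """Smallest n <= max_n with P^n entrywise positive, else None."""
    Q = np.eye(len(P))
    for n in range(1, max_n + 1):
        Q = Q @ P
        if (Q > 0).all():
            return n
    return None

print(first_positive_power(aperiodic))  # a finite power (here 5)
print(first_positive_power(periodic))   # None: no power is positive
```
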
6. Let Z_0, Z_1, Z_2, ... be a realization of an ergodic chain. If we sample
   every kth epoch, then show (a) that the sampled chain Z_0, Z_k, Z_{2k}, ...
   is ergodic, (b) that it possesses the same equilibrium distribution as
   the original chain, and (c) that it is reversible if the original chain is.
   Thus, we can estimate theoretical means by sample averages using
   only every kth epoch of the original chain. (A simulation sketch of
   claim (b) follows.)
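
Claim (b) can be checked by simulation. A minimal Python sketch, assuming a small reversible three-state chain whose equilibrium distribution (1/4, 1/2, 1/4) is easy to verify by hand:

```python
import numpy as np

rng = np.random.default_rng(0)

# A small ergodic, reversible chain with equilibrium pi = (1/4, 1/2, 1/4).
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
pi = np.array([0.25, 0.50, 0.25])

# Run the chain, then keep only every kth epoch.
n_steps, k, state = 200_000, 5, 0
path = np.empty(n_steps, dtype=int)
for t in range(n_steps):
    state = rng.choice(3, p=P[state])
    path[t] = state

print(pi)                                                    # exact
print(np.bincount(path, minlength=3) / n_steps)              # full chain
print(np.bincount(path[::k], minlength=3) / len(path[::k]))  # thinned chain
```
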

7. The Metropolis acceptance mechanism (9.6) ordinarily implies
   aperiodicity of the underlying Markov chain. Show that if the proposal
   distribution is symmetric and if some state i has a neighboring state
   j such that π_i > π_j, then the period of state i is 1, and the chain,
   if irreducible, is aperiodic. For a counterexample, assign probability
   π_i = 1/4 to each vertex i of a square. If the two vertices adjacent to a
   given vertex i are each proposed with probability 1/2, then show that
   all proposed steps are accepted by the Metropolis criterion and that
   the chain is periodic with period 2. (A simulation of this counterexample
   is sketched below.)
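
The counterexample on the square is easy to simulate. A minimal Python sketch in which the four vertices carry the uniform distribution and each step proposes one of the two adjacent vertices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Vertices 0, 1, 2, 3 around the square; pi is uniform, so pi[j]/pi[i] = 1.
pi = np.full(4, 0.25)

def metropolis_step(i):
    # Symmetric proposal: each of the two neighbors with probability 1/2.
    j = (i + rng.choice([-1, 1])) % 4
    # Metropolis acceptance probability min(1, pi[j]/pi[i]) = 1, so every
    # proposed move is accepted and the chain never stays put.
    return j if rng.random() < min(1.0, pi[j] / pi[i]) else i

state, parities = 0, []
for _ in range(10):
    state = metropolis_step(state)
    parities.append(state % 2)

# Adjacent vertices have opposite parity, so the parity alternates
# deterministically: the chain has period 2.
print(parities)  # [1, 0, 1, 0, ...]
```
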
8. If the component updated in Gibbs sampling depends probabilistically
   on the current state of the chain, how must the Hastings-Metropolis
   acceptance probability be modified to preserve detailed balance? Under
   the appropriate modification, the acceptance probability is no longer
   always 1. (One way of writing the modified acceptance probability is
   sketched below.)
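
As a sketch of where the modification comes from, write s_c(x) for the assumed state-dependent probability of choosing component c at state x, and let the proposal redraw that component from its full conditional:
\[
q(x, y) \;=\; s_c(x)\,\pi(y_c \mid x_{-c}), \qquad y_{-c} = x_{-c},
\]
so that detailed balance holds with the Hastings-Metropolis acceptance probability
\[
a(x, y) \;=\; \min\left\{1, \frac{\pi(y)\,q(y, x)}{\pi(x)\,q(x, y)}\right\}
\;=\; \min\left\{1, \frac{s_c(y)}{s_c(x)}\right\},
\]
which is identically 1 only when the selection probabilities do not depend on the current state.
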
9. Importance sampling is one remedy when the states of a Markov chain
   communicate poorly [13]. Suppose that π is the equilibrium distribution
   of the chain. If we sample from a chain whose distribution is ν,