Page 160 - Elements of Distribution Theory
                            146                   Parametric Families of Distributions

provided that $t = \sum_{j=1}^n y_j$. This probability simplifies to
\[
\frac{1}{\binom{n}{t}}
\]
for all $y_1, \dots, y_n$ taking values in the set $\{0, 1\}$ such that $\sum_{j=1}^n y_j = t$.
  That is, given that $\sum_{j=1}^n Y_j = t$, each possible arrangement of 1s and 0s summing to $t$ is equally likely.
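This conditional uniformity can be checked directly from the Bernoulli joint probability and the binomial probability of the sum. The following sketch (the function name and the parameter values are illustrative) computes the conditional probability and shows that it reduces to $1/\binom{n}{t}$, free of the success probability $p$:

```python
from math import comb

def conditional_prob(y, p):
    """P(Y = y | sum(Y) = t) for independent Bernoulli(p) components,
    computed directly as joint probability over marginal probability."""
    n, t = len(y), sum(y)
    joint = (p ** t) * ((1 - p) ** (n - t))   # P(Y1 = y1, ..., Yn = yn)
    marginal = comb(n, t) * joint             # P(sum = t), binomial probability
    return joint / marginal                   # = 1 / comb(n, t), free of p

# Any arrangement with the same total t has the same conditional probability,
# whatever the value of p:
print(conditional_prob([1, 1, 0, 0, 0], p=0.3))  # 1 / comb(5, 2) = 0.1
print(conditional_prob([0, 0, 1, 0, 1], p=0.8))  # same total, same value
```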
Example 5.15 (Exponential random variables). Let $Y_1, \dots, Y_n$ denote independent, identically distributed random variables, each distributed according to the absolutely continuous distribution with density function
\[
\theta \exp\{-\theta y\}, \qquad y > 0,
\]
where $\theta > 0$. Then $Y = (Y_1, \dots, Y_n)$ has model function
\[
\theta^n \exp\Bigl\{-\theta \sum_{j=1}^n y_j\Bigr\}, \qquad y = (y_1, \dots, y_n) \in (0, \infty)^n;
\]
hence, this is a one-parameter exponential family of distributions with natural parameter $\eta = -\theta$, $H = (-\infty, 0)$, and $T(y) = \sum_{j=1}^n y_j$.
  Let $a_1, \dots, a_n$ denote real-valued, nonzero constants and let $Z = \sum_{j=1}^n a_j \log Y_j$. Then $Z$ has moment-generating function
\[
M_Z(t; \theta) = E\Bigl[\exp\Bigl\{t \sum_{j=1}^n a_j \log(Y_j)\Bigr\}; \theta\Bigr]
= \prod_{j=1}^n E\bigl[Y_j^{a_j t}; \theta\bigr]
= \prod_{j=1}^n \frac{\Gamma(a_j t + 1)}{\theta^{a_j t}}
= \frac{\prod_{j=1}^n \Gamma(a_j t + 1)}{\theta^{\,t \sum_{j=1}^n a_j}},
\qquad |t| < 1/\max(|a_1|, \dots, |a_n|).
\]
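The middle step uses the moment formula $E[Y^s; \theta] = \Gamma(s + 1)/\theta^s$ for an exponential variable with rate $\theta$, valid for $s > -1$; here $s$ plays the role of $a_j t$. A quick numerical check of this identity, sketched in Python with a simple midpoint-rule quadrature (the grid size and upper cutoff are ad hoc choices):

```python
from math import gamma, exp

def exp_moment_numeric(s, theta, upper=60.0, steps=200_000):
    """E[Y^s] for Y ~ exponential with rate theta, by midpoint-rule
    integration of y**s * theta * exp(-theta * y) over (0, upper)."""
    h = upper / steps
    return sum(((i + 0.5) * h) ** s * theta * exp(-theta * (i + 0.5) * h) * h
               for i in range(steps))

theta, s = 2.0, 0.7
print(exp_moment_numeric(s, theta))   # numerical integral
print(gamma(s + 1) / theta ** s)      # closed form Gamma(s + 1) / theta**s
```

The two printed values agree to several decimal places; the quadrature error here is dominated by the grid spacing, not the tail truncation.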
It follows that the distribution of $Z$ does not depend on $\theta$ if and only if $\sum_{j=1}^n a_j = 0$.
  Hence, since $H_0 = H$, by Theorem 5.4, $\sum_{j=1}^n a_j \log(Y_j)$ and $\sum_{j=1}^n Y_j$ are independent if and only if $\sum_{j=1}^n a_j = 0$.
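The "if" direction can be illustrated by simulation. The sketch below (the sample size, seed, and the choice $a = (1, -1)$ are arbitrary) draws exponential samples and checks that the sample correlation between $Z$ and $T(Y)$ is near zero; a small correlation is of course only necessary, not sufficient, for independence.

```python
import random
from math import log

def corr(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

random.seed(5)
theta, n_samples = 2.0, 100_000
a = (1.0, -1.0)                # a_1 + a_2 = 0, so Z and T(Y) should be independent
z_vals, t_vals = [], []
for _ in range(n_samples):
    y1 = random.expovariate(theta)
    y2 = random.expovariate(theta)
    z_vals.append(a[0] * log(y1) + a[1] * log(y2))  # Z = sum of a_j log Y_j
    t_vals.append(y1 + y2)                          # T(Y) = sum of Y_j
print(corr(z_vals, t_vals))    # near 0, consistent with independence
```

Repeating the experiment with, say, $a = (1, 1)$ produces a clearly nonzero correlation, in line with the "only if" direction.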
  In applying the second part of Theorem 5.4 it is important that $H_1$ contains an open subset of $\mathbf{R}^m$. Otherwise, the condition that the distribution of $Z$ does not depend on $\theta \in \Theta$ is not strong enough to ensure that $Z$ and $T(Y)$ are independent. The following example illustrates this possibility.
Example 5.16. Let $Y_1$ and $Y_2$ denote independent Poisson random variables such that $Y_1$ has mean $\theta$ and $Y_2$ has mean $1 - \theta$, where $0 < \theta < 1$. The model function for the distribution of $Y = (Y_1, Y_2)$ can then be written
\[
\exp\{y_1 \log \theta + y_2 \log(1 - \theta)\} \frac{1}{y_1! \, y_2!}, \qquad y_1 = 0, 1, \dots;\ y_2 = 0, 1, \dots.
\]
Hence, $c(\theta) = (\log \theta, \log(1 - \theta))$ and $T(y) = (y_1, y_2)$.
  Let $Z = Y_1 + Y_2$. Then, by Example 4.15, $Z$ has a Poisson distribution with mean $\theta + (1 - \theta) = 1$, so that the distribution of $Z$ does not depend on $\theta$. However, $Z$ and $(Y_1, Y_2)$ are clearly not independent; for instance, since $Y_1$ and $Y_2$ are independent, $\mathrm{Cov}(Z, Y_1; \theta) = \mathrm{Var}(Y_1; \theta) = \theta$.
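A simulation makes the failure of independence concrete. The sketch below (the Knuth-style Poisson sampler, seed, sample size, and the choice $\theta = 0.3$ are illustrative) estimates that $Z$ has mean near 1 whatever the value of $\theta$, while $\mathrm{Cov}(Z, Y_1)$ tracks $\theta$:

```python
import random
from math import exp

def sample_poisson(mean, rng):
    """Poisson sampler via Knuth's multiplication method (fine for small means)."""
    limit, k, p = exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(11)
theta, n = 0.3, 200_000
y1 = [sample_poisson(theta, rng) for _ in range(n)]
y2 = [sample_poisson(1.0 - theta, rng) for _ in range(n)]
z = [a + b for a, b in zip(y1, y2)]          # Z = Y1 + Y2

mean_z = sum(z) / n
cov_z_y1 = sum(zi * yi for zi, yi in zip(z, y1)) / n - mean_z * (sum(y1) / n)
print(mean_z)      # near 1: the distribution of Z is free of theta
print(cov_z_y1)    # near theta = 0.3: Z and Y1 are dependent
```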