



              Thus,

                        \ln f(x; \mu, \sigma) = -\frac{n}{2} \ln(2\pi\sigma^2) - \frac{1}{2\sigma^2} \sum_{i=1}^{n} (x_i - \mu)^2

              In order to find the values of μ and σ maximizing the above, we compute

                        \frac{\partial}{\partial \mu} \ln f(x; \mu, \sigma) = \frac{1}{\sigma^2} \sum_{i=1}^{n} (x_i - \mu)

                        \frac{\partial}{\partial \sigma} \ln f(x; \mu, \sigma) = -\frac{n}{\sigma} + \frac{1}{\sigma^3} \sum_{i=1}^{n} (x_i - \mu)^2

              Equating these equations to zero, we get

                        \frac{1}{\sigma^2} \sum_{i=1}^{n} (x_i - \mu) = 0

                        -\frac{n}{\sigma} + \frac{1}{\sigma^3} \sum_{i=1}^{n} (x_i - \mu)^2 = 0

              Solving for \hat{\mu}_{ML} and \hat{\sigma}_{ML}, the maximum-likelihood estimates of μ and σ² are given, respectively, by

                        \hat{\mu}_{ML} = \frac{1}{n} \sum_{i=1}^{n} x_i = \bar{x}

                        \hat{\sigma}_{ML}^2 = \frac{1}{n} \sum_{i=1}^{n} (x_i - \hat{\mu}_{ML})^2 = \frac{1}{n} \sum_{i=1}^{n} (x_i - \bar{x})^2

              Hence, the maximum-likelihood estimators of μ and σ² are given, respectively, by

                        \hat{\mu}_{ML} = \frac{1}{n} \sum_{i=1}^{n} X_i = \bar{X}

                        \hat{\sigma}_{ML}^2 = \frac{1}{n} \sum_{i=1}^{n} (X_i - \bar{X})^2
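              As an informal check on the closed-form results above (not part of the original solution), the short Python sketch below compares \hat{\mu}_{ML} = \bar{x} and \hat{\sigma}_{ML}^2 = (1/n) \sum (x_i - \bar{x})^2 with a direct numerical maximization of the normal log-likelihood; the simulated sample and the use of numpy/scipy are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize

# Hypothetical sample (an assumption for illustration, not data from the text)
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=500)
n = len(x)

# Closed-form maximum-likelihood estimates derived above
mu_ml = x.mean()
sigma2_ml = ((x - mu_ml) ** 2).mean()

# Negative of ln f(x; mu, sigma) for a normal sample, as in the derivation
def neg_log_lik(theta):
    mu, sigma = theta
    return n * np.log(sigma * np.sqrt(2 * np.pi)) + ((x - mu) ** 2).sum() / (2 * sigma ** 2)

res = minimize(neg_log_lik, x0=[0.0, 1.0], bounds=[(None, None), (1e-6, None)])
mu_num, sigma_num = res.x

print(mu_ml, sigma2_ml)          # closed-form estimates
print(mu_num, sigma_num ** 2)    # numerical maximizer agrees to solver tolerance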

        BAYES'  ESTIMATION
        7.11.  Let (X_1, ..., X_n) be a random sample of a Bernoulli r.v. X with pmf given by [Eq. (2.32)]

                                       f(x; p) = p^x (1 - p)^{1 - x}      x = 0, 1                (7.43)

               where p, 0 ≤ p ≤ 1, is unknown. Assume that p is a uniform r.v. over (0, 1). Find the Bayes'
               estimator of p.
                  The prior pdf of p is the uniform pdf; that is,

                        f(p) = 1        0 < p < 1

               The posterior pdf of p is given by

                        f(p | x_1, ..., x_n) = \frac{f(x_1, ..., x_n | p) f(p)}{f(x_1, ..., x_n)}

               Then, by Eq. (7.12),

                        f(x_1, ..., x_n | p) = \prod_{i=1}^{n} p^{x_i} (1 - p)^{1 - x_i} = p^m (1 - p)^{n - m}

               where m = \sum_{i=1}^{n} x_i, and by Eq. (7.13),

                        f(x_1, ..., x_n) = \int_0^1 p^m (1 - p)^{n - m} \, dp
               Now, from calculus, for integers m and k, we have

                        \int_0^1 p^m (1 - p)^k \, dp = \frac{m!\, k!}{(m + k + 1)!}
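               As an informal numerical check of this identity (an addition, not in the original text), the Python sketch below compares a quadrature value of the integral with the closed-form ratio of factorials for a few arbitrary integer pairs (m, k); the use of scipy.integrate.quad is an assumption for illustration.

from math import factorial
from scipy.integrate import quad

for m, k in [(0, 0), (2, 3), (5, 1), (4, 4)]:
    # Left-hand side: numerical quadrature of p^m (1 - p)^k over (0, 1)
    numeric, _ = quad(lambda p: p ** m * (1 - p) ** k, 0.0, 1.0)
    # Right-hand side: m! k! / (m + k + 1)!
    closed_form = factorial(m) * factorial(k) / factorial(m + k + 1)
    print((m, k), numeric, closed_form)   # the two values agree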