
estimator of σ² coincides with (n − 1)n⁻¹S². Note, however, that (X̄, (n − 1)n⁻¹S²) is
sufficient for θ too. !
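A quick numerical check (an illustrative Python sketch, not part of the text; the simulated data, seed, and sample size are arbitrary) of the identity between the second central sample moment and (n − 1)n⁻¹S²:

   import numpy as np

   # The method of moments estimator of sigma^2, i.e. the second central
   # sample moment, coincides with (n - 1) n^{-1} S^2.
   rng = np.random.default_rng(0)                # arbitrary seed
   x = rng.normal(loc=5.0, scale=2.0, size=50)   # arbitrary normal sample
   n = x.size

   mom_sigma2 = np.mean(x**2) - np.mean(x)**2    # (1/n) * sum (x_i - xbar)^2
   scaled_s2 = (n - 1) / n * np.var(x, ddof=1)   # (n - 1) n^{-1} S^2

   print(mom_sigma2, scaled_s2)                  # the two values agree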
Example 7.2.3 Suppose that X₁, ..., Xₙ are iid N(0, σ²) where θ = σ² is
unknown, 0 < σ < ∞. Here χ = ℜ and Θ = ℜ⁺. Observe that η₁ ≡ η₁(θ) =
E_θ[X₁] = 0 for all θ. In this situation, it is clear that the equation given
by the first moment in (7.2.1) does not lead to anything interesting and
one may arbitrarily move to use the second moment. Note that η₂ ≡ η₂(θ) =
E_θ[X₁²] = σ², so that (7.2.1) will now lead to σ̂² = n⁻¹Σᵢ₌₁ⁿ Xᵢ².
After such an ad hoc adjustment, the method of moment estimator turns out to
be sufficient for σ². !
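A short simulation sketch (illustrative Python code, not from the text; the true σ, seed, and sample size are assumptions) of the second-moment estimator in Example 7.2.3:

   import numpy as np

   # Method of moments for N(0, sigma^2) via the second moment, since
   # E[X_1] = 0 makes the first moment equation uninformative.
   rng = np.random.default_rng(1)                # arbitrary seed
   sigma_true = 1.5                              # assumed true value of sigma
   x = rng.normal(loc=0.0, scale=sigma_true, size=200)

   sigma2_hat = np.mean(x**2)                    # equates eta_2 = sigma^2 with (1/n) sum X_i^2
   print(sigma2_hat)                             # should land near sigma_true**2 = 2.25

Note that n⁻¹Σᵢ Xᵢ² is a function of Σᵢ Xᵢ², which is sufficient for σ² in this model.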
    The method of moments is an ad hoc way to find estimators. Also, this
    method may not lead to estimators which are functions of minimal
    sufficient statistics for θ. Look at Examples 7.2.4-7.2.5.
If any of the theoretical moments η₁, η₂, ..., ηₖ is zero, then one will continue
to work with the first k non-zero theoretical moments so that (7.2.1) may
lead to sensible solutions. Of course, there is a lot of arbitrariness in this
approach.
Example 7.2.4 Suppose that X₁, ..., Xₙ are iid Poisson(λ) where θ = λ is
unknown, 0 < λ < ∞. Here χ = {0, 1, 2, ...} and Θ = ℜ⁺. Now, η₁ ≡ η₁(θ) =
E_θ[X₁] = λ and η₂ ≡ η₂(θ) = E_θ[X₁²] = λ + λ². Suppose that instead of
starting with η₁ in (7.2.1), we start with η₂ and equate this with n⁻¹Σᵢ₌₁ⁿ Xᵢ².
This then provides the equation λ + λ² = n⁻¹Σᵢ₌₁ⁿ Xᵢ²,
which leads to the estimator λ̂ = ½{−1 + √(1 + 4n⁻¹Σᵢ₌₁ⁿ Xᵢ²)}. However, if
we had started with η₁, we would have ended up with the estimator X̄, a
minimal sufficient statistic for λ. The first estimator is not sufficient for λ.
From this example, one can feel the sense of arbitrariness built within this
methodology. !
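The two estimators in Example 7.2.4 can be compared side by side with a small simulation (a hypothetical Python sketch; the true λ, seed, and sample size are assumptions):

   import numpy as np

   # Two method of moments estimators of lambda for Poisson data:
   # one from the first moment, one from the second.
   rng = np.random.default_rng(2)                # arbitrary seed
   lam_true = 4.0                                # assumed true value of lambda
   x = rng.poisson(lam=lam_true, size=500)

   lam_hat_first = np.mean(x)                    # from eta_1 = lambda: simply Xbar
   m2 = np.mean(x.astype(float)**2)              # second sample moment
   lam_hat_second = 0.5 * (-1.0 + np.sqrt(1.0 + 4.0 * m2))   # solves lambda + lambda^2 = m2

   print(lam_hat_first, lam_hat_second)

On a typical run both values land near λ, but only the first is a function of Σᵢ Xᵢ, the minimal sufficient statistic.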
Example 7.2.5 Suppose that X₁, ..., Xₙ are iid Uniform(0, θ) where θ(> 0)
is the unknown parameter. Here η₁ ≡ η₁(θ) = E_θ[X₁] = ½θ so that by equating
this with X̄ we obtain the estimator θ̂ = 2X̄, which is not sufficient for θ.
Recall that Xₙ:ₙ is a minimal sufficient statistic for θ. !
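A similar sketch (assumed Python simulation; the true θ, seed, and sample size are arbitrary) contrasts the method of moments estimator 2X̄ with the minimal sufficient statistic Xₙ:ₙ from Example 7.2.5:

   import numpy as np

   # Method of moments estimator 2*Xbar for Uniform(0, theta) versus the
   # minimal sufficient statistic X_{n:n} = max(X_1, ..., X_n).
   rng = np.random.default_rng(3)                # arbitrary seed
   theta_true = 10.0                             # assumed true value of theta
   x = rng.uniform(low=0.0, high=theta_true, size=100)

   theta_mom = 2.0 * np.mean(x)                  # equates eta_1 = theta/2 with Xbar
   theta_max = np.max(x)                         # X_{n:n}, minimal sufficient for theta

   print(theta_mom, theta_max)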
7.2.2   The Method of Maximum Likelihood

The method of moments appeared quite simple but it was ad hoc and arbitrary
in its approach. In Example 7.2.3 we saw that we could not equate