   It has now become commonplace for some authors of Bayesian papers to claim that a proposed “Bayes estimator” or “Bayes decision” also satisfies the “consistency” property in some sense. In the Bayesian literature, some authors refer to this as the Bayesian-Frequentist compromise. Would it then be fair to say that the (Fisherian) consistency property has not been such a bad concept after all? History does have an uncanny ability to correct its own course from time to time! At the very least, it certainly feels that way.
   A simple truth is that R. A. Fisher never suggested that one should choose an estimator because of its consistency property alone. The fact is that in many “standard” problems, some of the usual estimators, such as the MLE, the UMVUE, or estimators obtained via the method of moments, are frequently consistent for the parameter(s) of interest.
   Common sense dictates that the consistency property has to take a back seat when considered in conjunction with the sufficiency property. The estimator Tₙ defined in (7.7.1) is not sufficient for θ for any reasonable fixed sample size n when k = 10⁶. On the other hand, the sample mean X̄ₙ is sufficient for θ and it also happens to be consistent for θ.
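   As a quick numerical illustration of what consistency does (and does not) promise, here is a minimal simulation sketch, assuming X₁, ..., Xₙ iid N(θ, 1) with θ = 2; the parameter value, sample sizes, and seed are arbitrary illustration choices and are not part of the original example.

```python
import numpy as np

# A minimal sketch of consistency: the sample mean settles near theta as n grows.
# Assumed setup (not from the text): X_1, ..., X_n iid N(theta, 1) with theta = 2.
rng = np.random.default_rng(1)
theta = 2.0

for n in (10, 100, 1_000, 10_000, 100_000):
    x = rng.normal(loc=theta, scale=1.0, size=n)
    print(f"n = {n:>6d}   sample mean = {x.mean():.4f}")

# Consistency only describes this limiting behavior; it says nothing about how
# well an estimator performs at any one fixed sample size n.
```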
   We should add that, in general, an MLE may not be consistent for θ. Some examples of inconsistent MLE’s were constructed by Neyman and Scott (1948) and Basu (1955b). In Chapter 12, under mild regularity conditions, we quote a result in (12.2.3) showing that an MLE is indeed consistent for the parameter θ it is supposed to estimate in the first place. This result, in conjunction with the invariance property (Theorem 7.2.1), makes the MLE’s very appealing in practice.
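   The Neyman-Scott phenomenon mentioned above can also be seen numerically. The sketch below assumes the classical paired setup Xᵢ₁, Xᵢ₂ iid N(μᵢ, σ²), i = 1, ..., k, where the MLE of σ² converges to σ²/2 rather than σ²; the particular numerical values are arbitrary and not taken from the text.

```python
import numpy as np

# Sketch of the Neyman-Scott phenomenon with paired data X_i1, X_i2 ~ N(mu_i, sigma^2),
# i = 1, ..., k.  The MLE of sigma^2 is (2k)^(-1) * sum_i sum_j (X_ij - Xbar_i)^2,
# and it converges to sigma^2 / 2 instead of sigma^2 as k grows.
# (The values of k, sigma^2, and the nuisance means mu_i are arbitrary.)
rng = np.random.default_rng(7)
sigma2 = 4.0

for k in (100, 10_000, 1_000_000):
    mu = rng.uniform(-10.0, 10.0, size=k)                    # one nuisance mean per pair
    x = rng.normal(loc=mu[:, None], scale=np.sqrt(sigma2), size=(k, 2))
    xbar = x.mean(axis=1, keepdims=True)                     # per-pair MLE of mu_i
    mle_sigma2 = np.sum((x - xbar) ** 2) / (2 * k)           # MLE of sigma^2
    print(f"k = {k:>9d}   MLE of sigma^2 = {mle_sigma2:.3f}   (true value: {sigma2})")

# The MLE stabilizes near sigma^2 / 2 = 2.0: an inconsistent MLE.
```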



                                 7.8   Exercises and Complements

   7.2.1 (Example 7.2.3 Continued) Suppose that X₁, ..., Xₙ are iid N(0, σ²) where θ = σ² is unknown, 0 < σ < ∞. Here we have χ = ℜ, Θ = ℜ⁺, and η₁ ≡ η₁(θ) = E_θ[X₁] = 0, η₃ ≡ η₃(θ) = E_θ[X₁³] = 0 for all θ. It is clear that the equations given by the first and third moments in (7.2.1) do not lead to anything interesting. In Example 7.2.3 we had used the expression of η₂(θ). Now we may arbitrarily move to the fourth moment and write η₄ ≡ η₄(θ) = E_θ[X₁⁴] = 3σ⁴. Using η₄, find an appropriate estimator of σ². How is this estimator different from the one obtained in Example 7.2.3?
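   For readers who want a numerical companion to this exercise, the sketch below compares the second-moment estimator n⁻¹ΣXᵢ² from Example 7.2.3 with one fourth-moment candidate suggested by E_θ[X₁⁴] = 3σ⁴; the particular candidate, the value of σ², and the simulation settings are illustrative assumptions, not the intended solution.

```python
import numpy as np

# Companion sketch for Exercise 7.2.1 (assumptions, not the intended solution):
# with X_1, ..., X_n iid N(0, sigma^2), E[X^4] = 3*sigma^4 suggests the candidate
# sqrt(mean(X^4)/3) for sigma^2, to be compared with mean(X^2) from Example 7.2.3.
rng = np.random.default_rng(3)
sigma2, n, reps = 2.0, 200, 5_000          # arbitrary illustration choices

est2 = np.empty(reps)                      # based on the second sample moment
est4 = np.empty(reps)                      # based on the fourth sample moment
for r in range(reps):
    x = rng.normal(0.0, np.sqrt(sigma2), size=n)
    est2[r] = np.mean(x ** 2)
    est4[r] = np.sqrt(np.mean(x ** 4) / 3.0)

print(f"second-moment estimator: mean {est2.mean():.3f}, sd {est2.std():.3f}")
print(f"fourth-moment estimator: mean {est4.mean():.3f}, sd {est4.std():.3f}")

# Both center near sigma^2 = 2, but the fourth-moment version is typically more
# variable, which hints at why the route taken in Example 7.2.3 is preferable.
```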
   7.2.2 (Exercise 6.3.5 Continued) Suppose that X₁, ..., Xₙ are iid distributed as Geometric(p) random variables with the common pmf given by f(x; p) = p(1 − p)ˣ, x = 0, 1, 2, ..., and 0 < p < 1 is the unknown