Page 382 - Probability and Statistical Inference

7. Point Estimation

   Proof (i) Since U is sufficient for θ, the conditional distribution of T given
U = u cannot depend upon the unknown parameter θ, and this remains true
for all u ∈ U. This clearly follows from the Definition 6.2.3 of sufficiency.
Hence, g(u) is a function of u and it is free from θ for all u ∈ U. In other
words, W = g(U) is indeed a real valued statistic and so we can call it an
estimator. Using the Theorem 3.3.1, part (i), we can write E[X] =
E_Y[E(X | Y)] where X and Y are any two random variables with finite expec-
tations. Hence we have for all θ ∈ Θ,

      E_θ[W] = E_θ[E(T | U)] = E_θ[T] = T(θ),

which shows that W is an unbiased estimator of T(θ). !
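The step above rests on the tower property E[X] = E_Y[E(X | Y)] from Theorem 3.3.1, part (i). A tiny exact check of that identity, using an illustrative joint pmf chosen here for the sketch (not from the text):

```python
from fractions import Fraction as F

# Illustrative joint pmf of (X, Y) on {0,1} x {0,1}; an assumption for
# this sketch, not taken from the text.
pmf = {(0, 0): F(1, 8), (0, 1): F(3, 8), (1, 0): F(2, 8), (1, 1): F(2, 8)}

# Direct computation of E[X].
ex = sum(x * p for (x, y), p in pmf.items())

# Tower property: compute E(X | Y = y) for each y, then average over
# the marginal distribution of Y.
ys = {y for (_, y) in pmf}
ex_via_tower = F(0)
for y0 in ys:
    py = sum(p for (x, y), p in pmf.items() if y == y0)          # P(Y = y0)
    e_x_given_y = sum(x * p for (x, y), p in pmf.items() if y == y0) / py
    ex_via_tower += e_x_given_y * py

print(ex, ex_via_tower)  # both equal 1/2
```

Because the arithmetic is exact (rational fractions), the two computations agree identically, mirroring E[X] = E_Y[E(X | Y)].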
   (ii) Let us now proceed as follows for all θ ∈ Θ:

      V_θ[T] = E_θ[{T − T(θ)}²]
             = E_θ[{(T − W) + (W − T(θ))}²]
             = E_θ[{T − W}²] + 2E_θ[{T − W}{W − T(θ)}] + E_θ[{W − T(θ)}²]
             = E_θ[{T − W}²] + V_θ[W],                              ... (7.4.2)

since we have

      E_θ[{T − W}{W − T(θ)}] = E_θ[E({T − W}{W − T(θ)} | U)]
                             = E_θ[{W − T(θ)} E(T − W | U)]
                             = E_θ[{W − T(θ)}{E(T | U) − W}] = 0,

because W = g(U) = E(T | U), so that E(T − W | U) = 0 w.p.1.
   Now, from (7.4.2), the first conclusion in part (ii) is obvious since {T − W}²
is non-negative w.p.1 and thus E_θ[{T − W}²] ≥ 0 for all θ ∈ Θ. For the second
conclusion in part (ii), notice again from (7.4.2) that V_θ[W] = V_θ[T] for all θ ∈
Θ if and only if E_θ[{T − W}²] = 0 for all θ ∈ Θ, that is if and only if T is the
same as W w.p.1. The proof is complete. !
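As a quick numerical illustration of the theorem and of the decomposition (7.4.2), the sketch below simulates the Bernoulli setting with T = X_1 (unbiased but crude), U = X_1 + ... + X_n (sufficient), and W = E[T | U] = U/n. The particular values p = 0.3, n = 10, and the replication count are illustrative assumptions, not from the text:

```python
import random

# Simulation sketch of the Rao-Blackwell Theorem. The choices
# p = 0.3, n = 10 below are assumptions made for illustration only.
# T = X_1 is unbiased for p; U = X_1 + ... + X_n is sufficient;
# W = g(U) = E[T | U] = U/n is the improved unbiased estimator.

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((vi - m) ** 2 for vi in v) / len(v)

p, n, reps = 0.3, 10, 20000
rng = random.Random(7)

t_vals, w_vals = [], []
for _ in range(reps):
    x = [1 if rng.random() < p else 0 for _ in range(n)]
    t_vals.append(x[0])        # T = X_1
    w_vals.append(sum(x) / n)  # W = E[T | U] = U/n

# Both estimators are unbiased for p, but W has much smaller variance.
print("E[T] ~", mean(t_vals), " Var[T] ~", var(t_vals))  # theory: 0.3, p(1-p) = 0.21
print("E[W] ~", mean(w_vals), " Var[W] ~", var(w_vals))  # theory: 0.3, p(1-p)/n = 0.021

# Empirical check of (7.4.2): Var[T] = E[(T - W)^2] + Var[W]
sq_gap = mean([(t - w) ** 2 for t, w in zip(t_vals, w_vals)])
print("E[(T-W)^2] + Var[W] ~", sq_gap + var(w_vals))     # theory: 0.21
```

With 20,000 replications both estimators average close to p = 0.3, while the simulated variance of W is roughly a tenth of that of T, matching p(1 − p)/n versus p(1 − p), and the two sides of (7.4.2) agree up to simulation error.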
   One attractive feature of the Rao-Blackwell Theorem is that there is no
need to guess the functional form of the final unbiased estimator of T(θ).
Sometimes guessing the form of the final unbiased estimator of T(θ) may be
hard to do particularly when estimating some unusual parametric function.
One will see such illustrations in the Examples 7.4.5 and 7.4.7.
   Example 7.4.1 Suppose that X_1, ..., X_n are iid Bernoulli(p) where 0 <
p < 1 is unknown. We wish to estimate T(p) = p unbiasedly. Consider T =
X_1 which is an unbiased estimator of p. We were discussing this example