Proof Let us write E_θ(T) = ξ(θ). Then, we have

   MSE_T = E_θ[{T − T(θ)}²] = E_θ[{T − ξ(θ) + ξ(θ) − T(θ)}²]
         = E_θ[{T − ξ(θ)}²] + {ξ(θ) − T(θ)}² + 2E_θ[{T − ξ(θ)}{ξ(θ) − T(θ)}]
         = V_θ(T) + {E_θ(T) − T(θ)}² + 2E_θ[{T − ξ(θ)}{ξ(θ) − T(θ)}].

Now, since ξ(θ) − T(θ) is a fixed real number, we can write E_θ[{T − ξ(θ)}{ξ(θ) − T(θ)}] = {ξ(θ) − T(θ)} E_θ[T − ξ(θ)] = 0. Hence the result follows. !
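This variance-plus-squared-bias decomposition is easy to check by simulation. The sketch below is a minimal Monte Carlo illustration under assumed, purely illustrative choices (X_1, ..., X_n iid N(µ, σ²) with arbitrary µ, σ, n, and a deliberately biased estimator T = 0.9X̄ of µ, none of which come from the text); it estimates MSE_T directly and compares it with the estimated variance plus squared bias.

```python
import numpy as np

# Monte Carlo check of Theorem 7.3.1: MSE_T = V_theta(T) + {E_theta(T) - T(theta)}^2.
# Illustrative setup (not from the text): X_1,...,X_n iid N(mu, sigma^2) and
# T = 0.9 * Xbar, a deliberately biased estimator of T(theta) = mu.
rng = np.random.default_rng(1)
mu, sigma, n, reps = 5.0, 2.0, 10, 200_000

x = rng.normal(mu, sigma, size=(reps, n))
T = 0.9 * x.mean(axis=1)                  # biased estimator of mu

mse = np.mean((T - mu) ** 2)              # E_theta[{T - T(theta)}^2]
var = np.var(T)                           # V_theta(T)
bias_sq = (np.mean(T) - mu) ** 2          # {E_theta(T) - T(theta)}^2

print(mse, var + bias_sq)                 # the two agree up to Monte Carlo error
```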
   Now, we can evaluate MSE_{T_1} as V_µ(T_1) + [E_µ(T_1) − µ]² = 2σ² + (2µ − µ)² = µ² + 2σ². The evaluation of MSE_{T_4} is left as the Exercise 7.3.6.
   It is possible sometimes to have T_1 and T_2 which are respectively biased and unbiased estimators of T(θ), but MSE_{T_1} < MSE_{T_2}. In other words, intuitively a biased estimator may be preferable if its average squared error is smaller. Look at the Example 7.3.1.
   Example 7.3.1 Let X_1, ..., X_n be iid N(µ, σ²) where µ, σ² are both unknown, θ = (µ, σ²), −∞ < µ < ∞, 0 < σ < ∞, n ≥ 2. Here 𝒳 = ℜ and Θ = ℜ × ℜ⁺. Our goal is the estimation of a parametric function T(θ) = σ², the population variance. Consider the customary sample variance S² = (n − 1)⁻¹ Σ_{i=1}^{n} (X_i − X̄)². We know that S² is unbiased for σ². One will also recall that (n − 1)S²/σ² is distributed as χ²_{n−1} and hence V_θ(S²) = 2σ⁴(n − 1)⁻¹. Next, consider another estimator for σ², namely T = (n + 1)⁻¹ Σ_{i=1}^{n} (X_i − X̄)², which can be rewritten as (n − 1)(n + 1)⁻¹ S². Thus, E_θ(T) = (n − 1)(n + 1)⁻¹ σ² ≠ σ² and so T is a biased estimator of σ². Next, we evaluate

   V_θ(T) = (n − 1)²(n + 1)⁻² V_θ(S²) = 2σ⁴(n − 1)(n + 1)⁻².

Then we apply the Theorem 7.3.1 to express MSE_T as

   MSE_T = V_θ(T) + {E_θ(T) − σ²}² = 2σ⁴(n − 1)(n + 1)⁻² + 4σ⁴(n + 1)⁻² = 2σ⁴(n + 1)⁻¹,

which is smaller than V_θ(S²). That is, S² is unbiased for σ² and T is biased for σ², but MSE_T is smaller than MSE_{S²} for all θ. Refer to the Exercise 7.3.2 to see how one comes up with an estimator such as T for σ². !
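As a numerical companion to this example, the sketch below (with arbitrarily chosen µ, σ, and n, which are not specified in the text) simulates both estimators and compares their Monte Carlo MSEs against the closed-form values 2σ⁴(n − 1)⁻¹ and 2σ⁴(n + 1)⁻¹ derived above.

```python
import numpy as np

# Monte Carlo comparison of the unbiased S^2 and the biased T = sum(X_i - Xbar)^2 / (n+1)
# as estimators of sigma^2. The values of mu, sigma, n are illustrative assumptions.
rng = np.random.default_rng(7)
mu, sigma, n, reps = 0.0, 1.5, 10, 200_000
sigma2 = sigma ** 2

x = rng.normal(mu, sigma, size=(reps, n))
ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)  # sum of squared deviations

s2 = ss / (n - 1)    # customary sample variance (unbiased for sigma^2)
t = ss / (n + 1)     # biased estimator with smaller MSE

mse_s2 = np.mean((s2 - sigma2) ** 2)
mse_t = np.mean((t - sigma2) ** 2)

print(mse_s2, 2 * sigma2 ** 2 / (n - 1))  # simulated vs exact 2*sigma^4/(n-1)
print(mse_t, 2 * sigma2 ** 2 / (n + 1))   # simulated vs exact 2*sigma^4/(n+1)
```

For n = 10 the exact values are 2σ⁴/9 and 2σ⁴/11, so the biased estimator's MSE is uniformly smaller, in line with the conclusion of the example.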
   In the context of the example we had been discussing earlier in this section, suppose that we consider two other estimators T_7 = ½X_1 and