7.5.7 (Example 7.5.11 Continued) Let X₁, ..., Xₙ be iid N(µ, σ²) where µ, σ are both unknown with µ ∈ ℜ, σ ∈ ℜ⁺, n ≥ 2. Let θ = (µ, σ) and T(θ) = µσᵏ where k is a known and fixed real number. Derive the UMVUE for T(θ). Pay particular attention to any required minimum sample size n which may be needed. {Hint: Use (2.3.26) and the independence between X̄ and S to first derive the expectation of Sᵏ where S² is the sample variance. Then make some final adjustments.}
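
One way to carry out the first step of the hint, assuming (2.3.26) supplies the gamma-integral (chi-square moment) formula: since (n − 1)S²/σ² ~ χ²ₙ₋₁ and X̄ is independent of S, the key expectation is

\[
E_{\theta}\{\bar{X}\,S^{k}\}
  = \mu\,E_{\theta}\{S^{k}\}
  = \mu\,\sigma^{k}\,
    \frac{2^{k/2}\,\Gamma\!\left(\tfrac{n-1+k}{2}\right)}
         {(n-1)^{k/2}\,\Gamma\!\left(\tfrac{n-1}{2}\right)},
  \qquad \text{provided } n-1+k>0 .
\]

Dividing X̄Sᵏ by the constant multiplying µσᵏ gives an unbiased estimator that is a function of the complete sufficient statistic (X̄, S²); the requirement n − 1 + k > 0 is where a minimum sample size restriction may enter.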
7.5.8 (Example 7.5.12 Continued) Let X₁, ..., Xₙ be iid having the common pdf σ⁻¹ exp{−(x − µ)/σ}I(x > µ) where µ, σ are both unknown with −∞ < µ < ∞, 0 < σ < ∞, n ≥ 2. Let θ = (µ, σ) and T(θ) = µσᵏ where k is a known and fixed real number. Derive the UMVUE for T(θ). Pay particular attention to any required minimum sample size n which may be needed. {Hint: Use (2.3.26) and the independence between X_{n:1} and Y = Σᵢ₌₁ⁿ(Xᵢ − X_{n:1}) to first derive the expectation of Yᵏ. Then make some final adjustments.}
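
A sketch of the analogous moment here, using the standard facts for this model that Y = Σᵢ₌₁ⁿ(Xᵢ − X_{n:1}) is distributed as Gamma(n − 1, σ) and is independent of X_{n:1}, with E_θ{X_{n:1}} = µ + σ/n:

\[
E_{\theta}\{Y^{k}\} = \sigma^{k}\,\frac{\Gamma(n-1+k)}{\Gamma(n-1)}
  \quad (\text{provided } n-1+k>0),
\qquad
E_{\theta}\{X_{n:1}\,Y^{k}\}
  = \left(\mu + \frac{\sigma}{n}\right)E_{\theta}\{Y^{k}\} .
\]

The final adjustments then amount to subtracting a suitable multiple of Yᵏ⁺¹ to cancel the σᵏ⁺¹/n term and rescaling so that the expectation equals µσᵏ.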
7.5.9 Suppose that X₁, ..., Xₙ are iid Uniform(−θ, θ) where θ is the unknown parameter, θ ∈ ℜ⁺. Let T(θ) = θᵏ where k is a known and fixed positive real number. Derive the UMVUE for T(θ). {Hint: Verify that U = |X|_{n:n}, the largest of |X₁|, ..., |Xₙ|, is complete sufficient for θ. Find the pdf of U/θ to first derive the expectation of Uᵏ. Then make some final adjustments.}
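
A sketch of the moment computation suggested by the hint: with U the largest of |X₁|, ..., |Xₙ|, the ratio U/θ has pdf n uⁿ⁻¹ on 0 < u < 1, so that

\[
E_{\theta}\{U^{k}\}
  = \theta^{k}\int_{0}^{1} n\,u^{\,n-1+k}\,du
  = \frac{n}{n+k}\,\theta^{k} .
\]

Hence (n + k)Uᵏ/n is unbiased for θᵏ and, being a function of the complete sufficient statistic U, it would be the UMVUE.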
7.5.10 In each of Examples 7.4.1-7.4.8, argue that the Rao-Blackwellized estimator W is indeed the UMVUE for the respective parametric function. In the single-parameter problems, verify whether the variance of the UMVUE attains the corresponding CRLB.
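
For the CRLB part, the inequality to check (under the usual regularity conditions, with I(θ) the Fisher information in a single observation and W unbiased for T(θ)) is

\[
\mathrm{Var}_{\theta}\{W\} \;\ge\; \frac{\{T'(\theta)\}^{2}}{n\,I(\theta)} ,
\]

and equality holds only when the score function equals a(θ){W − T(θ)} with probability one for some function a(θ); this is the criterion to verify in each single-parameter example.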
7.5.11 Suppose that X₁, ..., Xₙ, Xₙ₊₁ are iid N(µ, σ²) where µ is unknown but σ is assumed known with µ ∈ ℜ, σ ∈ ℜ⁺, n ≥ 2. Consider the parametric function T(µ) = P_µ{X₁ > 0} = Φ(µ/σ). The problem is to find the UMVUE for T(µ). Start with T = I(Xₙ₊₁ > 0), which is an unbiased estimator for T(µ). Now, proceed along the following steps.
   (i)  Note that U = X̄ = (n + 1)⁻¹ Σᵢ₌₁ⁿ⁺¹ Xᵢ is complete sufficient for µ and that T is an unbiased estimator of T(µ);
   (ii) Observe that W = E_µ{T | U = u} = P_µ{T = 1 | U = u}, and find the expression for W. {Hint: Write down explicitly the bivariate normal distribution of T₁ = Xₙ₊₁ and U. Then, find the conditional probability P_µ{T₁ > 0 | U = u} utilizing Theorem 3.6.1. Next, argue that T₁ > 0 holds if and only if T = 1.}
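
A sketch of part (ii), under the reading that T₁ = Xₙ₊₁ and U is the mean of all n + 1 observations: (Xₙ₊₁, U) is bivariate normal with correlation 1/√(n + 1), so the conditional distribution of Xₙ₊₁ given U = u is N(u, σ²n/(n + 1)), and therefore

\[
W \;=\; P_{\mu}\{X_{n+1} > 0 \mid U = u\}
  \;=\; \Phi\!\left(\frac{u}{\sigma}\sqrt{\frac{n+1}{n}}\right),
\]

which, evaluated at the observed X̄, would be the UMVUE of Φ(µ/σ).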
7.5.12 Suppose that X₁, ..., Xₙ are iid Bernoulli(p) where 0 < p < 1 is an unknown parameter. Consider the parametric function T(p) = p + qe² with q = 1 − p.