(i) Find the MSE of V. Then minimize this MSE with respect to c. Call the resulting estimator W, which has the smallest MSE among the estimators of σ² that are multiples of U;
(ii) Show that the estimator W coincides with the estimator used in Example 7.3.1.
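A quick Monte Carlo sketch can be used to sanity-check part (i) numerically. It assumes the setting of Example 7.3.1, namely a N(µ, σ²) sample with U = Σᵢ(Xᵢ − X̄)²; that definition of U, like the choices of n, σ, and the grid of c values below, is an assumption made only for this illustration.

    import numpy as np

    # Monte Carlo scan over the constant c in V = c*U (assumed setting:
    # N(mu, sigma^2) sample, U = sum of squared deviations from the sample mean).
    rng = np.random.default_rng(1)
    n, mu, sigma = 10, 0.0, 2.0
    reps = 200_000

    x = rng.normal(mu, sigma, size=(reps, n))
    u = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

    for c in (1 / (n - 1), 1 / n, 1 / (n + 1)):
        mse = np.mean((c * u - sigma ** 2) ** 2)   # MSE of c*U as an estimator of sigma^2
        print(f"c = {c:.4f}   estimated MSE = {mse:.4f}")

Whichever c the analysis in part (i) produces should also give the smallest estimated MSE in this scan.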
7.3.3 (Exercise 7.2.13 Continued) Let X₁, ..., Xₙ be iid having the common pdf σ⁻¹ exp{−(x − µ)/σ}I(x > µ), where µ, σ are both unknown, −∞ < µ < ∞, 0 < σ < ∞, n ≥ 2. Denote by U the statistic from Exercise 7.2.13 and let V = cU be an estimator of σ, where c(> 0) is a constant.
(i) Find the MSE of V. Then minimize this MSE with respect to c. Call the resulting estimator W, which has the smallest MSE among the estimators of σ that are multiples of U;
(ii) How do the estimator W and the estimator of σ obtained in Exercise 7.2.13 compare relative to their respective bias and MSE?
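A similar simulation sketch can be used to check the bias and MSE computations in Exercise 7.3.3. The statistic U is not reproduced on this page, so taking U to be the sum of the deviations of the observations from the sample minimum, as well as the particular n, µ, and σ below, is an assumption for illustration only.

    import numpy as np

    # Estimated bias and MSE of c*U for the shifted exponential model,
    # with U assumed to be sum_i (X_i - min_j X_j).
    rng = np.random.default_rng(2)
    n, mu, sigma = 8, 1.0, 3.0
    reps = 200_000

    x = mu + rng.exponential(scale=sigma, size=(reps, n))   # pdf sigma^(-1) e^{-(x - mu)/sigma}, x > mu
    u = (x - x.min(axis=1, keepdims=True)).sum(axis=1)

    for c in (1 / (n - 1), 1 / n, 1 / (n + 1)):
        est = c * u
        print(f"c = {c:.4f}   bias = {est.mean() - sigma:+.4f}   "
              f"MSE = {np.mean((est - sigma) ** 2):.4f}")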
7.3.4 Suppose that X₁, ..., Xₙ are iid from the following respective populations. Find the expressions for the BLUE of θ, the parameter of interest in each case.
(i) The population is Poisson(λ) where θ = λ ∈ ℜ⁺;
(ii) The population is Binomial(n, p) where θ = p ∈ (0, 1);
(iii) The population has the pdf f(x) = (2σ)⁻¹ exp(−|x|/σ), x ∈ ℜ, where θ = σ ∈ ℜ⁺.
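To build intuition for what "best" means among linear unbiased estimators, the sketch below compares two linear unbiased estimators of λ in the Poisson case of part (i): since E[Xᵢ] = λ, any weights summing to one yield an unbiased linear estimator, and the simulation suggests which weighting has the smaller variance. The particular weights and the value of λ are arbitrary choices for illustration.

    import numpy as np

    # Two linear unbiased estimators of lambda: weights summing to one give
    # E[sum_i c_i X_i] = lambda, so both are unbiased; compare their variances.
    rng = np.random.default_rng(3)
    n, lam, reps = 5, 4.0, 200_000
    x = rng.poisson(lam, size=(reps, n))

    weights = {
        "equal weights (sample mean)": np.full(n, 1 / n),
        "unequal weights":             np.array([0.5, 0.2, 0.1, 0.1, 0.1]),
    }
    for name, w in weights.items():
        est = x @ w
        print(f"{name:28s} mean = {est.mean():.3f}   variance = {est.var():.4f}")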
7.3.5 (Exercise 7.3.1 Continued) Suppose that X₁, ..., Xₙ are iid N(0, σ²) where 0 < σ < ∞ is the unknown parameter. Consider estimating σ unbiasedly by linear functions of |Xᵢ|, i = 1, ..., n. Within this class of estimators, find the expression for the BLUE of σ. Next, evaluate the variance of the BLUE of σ.
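Exercise 7.3.5 turns on the half-normal moment E|Xᵢ| = σ√(2/π) when Xᵢ is N(0, σ²). The sketch below only verifies that moment numerically; the value of σ and the simulation size are arbitrary.

    import numpy as np

    # Check the half-normal moment: for X ~ N(0, sigma^2), E|X| = sigma*sqrt(2/pi).
    rng = np.random.default_rng(4)
    sigma = 1.7
    x = rng.normal(0.0, sigma, size=1_000_000)

    print(np.abs(x).mean())              # Monte Carlo estimate of E|X|
    print(sigma * np.sqrt(2 / np.pi))    # exact value sigma*sqrt(2/pi)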
7.3.6 Look at the estimator T₄ defined in (7.3.1). Evaluate its MSE.
7.3.7 (Example 7.3.2 Continued) Suppose that we have iid Bernoulli(p) random variables X₁, ..., Xₙ where 0 < p < 1 is an unknown parameter. Show that there is no unbiased estimator for the parametric function (i) T(p) = p⁻¹(1 − p)⁻¹; (ii) T(p) = p/(1 − p).
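One way to get a feel for Exercise 7.3.7 is to note that the expectation of any estimator T(X₁, ..., Xₙ) under Bernoulli(p) sampling is a polynomial in p of degree at most n. The symbolic sketch below illustrates this for n = 2; the symbols t00, t01, t10, t11 are hypothetical placeholders for the values the estimator assigns to the four sample points.

    import sympy as sp

    # E_p[T(X1, X2)] for Bernoulli(p) data is a polynomial in p of degree <= 2,
    # whatever values t00 = T(0,0), ..., t11 = T(1,1) the estimator takes.
    p = sp.symbols('p')
    t = {(x1, x2): sp.symbols(f"t{x1}{x2}") for x1 in (0, 1) for x2 in (0, 1)}

    expectation = sum(t[(x1, x2)] * p**(x1 + x2) * (1 - p)**(2 - x1 - x2)
                      for x1 in (0, 1) for x2 in (0, 1))

    print(sp.expand(expectation))   # a polynomial in p, unlike p^(-1)(1 - p)^(-1)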
7.3.8 (Example 7.3.3 Continued) Suppose that we have iid Bernoulli(p) random variables X₁, X₂, ... where 0 < p < 1 is an unknown parameter. Consider the parametric function T(p) = p⁻² and the observable random variable N defined in Example 7.3.3. Use the expressions for the mean and variance of the Geometric distribution to find an estimator T involving N so that T is unbiased for T(p).
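A small simulation can be used to recall the Geometric moments the exercise asks you to use. Treating N as the number of Bernoulli(p) trials up to and including the first success is an assumption here, since Example 7.3.3's definition of N is not reproduced on this page.

    import numpy as np

    # Empirical mean and variance of the trial count to the first success,
    # compared with the Geometric formulas E[N] = 1/p and Var[N] = (1 - p)/p^2.
    rng = np.random.default_rng(5)
    p, reps = 0.3, 200_000
    n_trials = rng.geometric(p, size=reps)       # support {1, 2, 3, ...}

    print(n_trials.mean(), 1 / p)                # compare with E[N] = 1/p
    print(n_trials.var(), (1 - p) / p ** 2)      # compare with Var[N] = (1 - p)/p^2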
7.4.1 Suppose that X₁, ..., Xₙ are iid Bernoulli(p) where 0 < p < 1 is an unknown parameter with n ≥ 2. Consider the parametric function