parameter. Find an estimator of p by the method of moments.
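The pmf for Exercise 7.2.2 is restated in Exercise 7.2.6 below: $f(x; p) = p(1 - p)^x$, x = 0, 1, 2, ..., so that E[X] = (1 - p)/p. As a numerical check on the moment estimator this suggests (a minimal sketch, not part of the original exercise; the seed, sample size, and true value of p are illustrative):

```python
import numpy as np

# Method of moments for Geometric(p) with pmf p*(1 - p)**x, x = 0, 1, 2, ...
# Equating the sample mean to E[X] = (1 - p)/p gives p_hat = 1/(1 + x_bar).
rng = np.random.default_rng(1)
p_true = 0.3
x = rng.geometric(p_true, size=5000) - 1  # numpy counts trials; shift to support {0, 1, 2, ...}
p_hat = 1.0 / (1.0 + x.mean())
print(p_hat)  # should land near p_true = 0.3
```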
7.2.3 Suppose that $X_1, \ldots, X_n$ are iid distributed as Gamma(α, β) random variables where α and β are both unknown parameters, 0 < α, β < ∞. Derive estimators for α and β by the method of moments.
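To verify a derivation numerically: with E[X] = αβ and Var(X) = αβ² under the shape-scale parametrization, matching the first two sample moments yields $\hat{\beta} = (m_2 - m_1^2)/m_1$ and $\hat{\alpha} = m_1^2/(m_2 - m_1^2)$. A hedged sketch, with illustrative parameter values and sample size:

```python
import numpy as np

# Method of moments for Gamma(alpha, beta): mean = alpha*beta,
# variance = alpha*beta**2, with beta acting as the scale parameter.
rng = np.random.default_rng(2)
alpha_true, beta_true = 2.0, 3.0
x = rng.gamma(shape=alpha_true, scale=beta_true, size=10000)

m1 = x.mean()
m2 = (x**2).mean()
beta_hat = (m2 - m1**2) / m1       # variance / mean
alpha_hat = m1**2 / (m2 - m1**2)   # mean^2 / variance
print(alpha_hat, beta_hat)         # should land near (2.0, 3.0)
```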
7.2.4 Suppose that $X_1, \ldots, X_n$ are iid whose common pdf is given by
where θ(> 0) is the unknown parameter. Derive an estimator for θ by the
method of moments.
7.2.5 Suppose that $X_1, \ldots, X_n$ are iid distributed as Beta(θ, θ) random variables where θ(> 0) is the unknown parameter. Derive an estimator for θ by the method of moments.
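One wrinkle worth noting: E[X] = 1/2 for every θ, so the first moment carries no information about θ and the second moment must be used; here Var(X) = 1/(4(2θ + 1)). A sketch of the resulting estimator (illustrative parameters, not part of the original text):

```python
import numpy as np

# Method of moments for Beta(theta, theta): the mean is 1/2 regardless
# of theta, so match the variance Var(X) = 1/(4*(2*theta + 1)) instead.
rng = np.random.default_rng(3)
theta_true = 2.5
x = rng.beta(theta_true, theta_true, size=20000)

v = ((x - 0.5)**2).mean()                  # second moment about the known mean 1/2
theta_hat = (1.0 / (4.0 * v) - 1.0) / 2.0
print(theta_hat)                           # should land near 2.5
```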
7.2.6 (Exercise 7.2.2 Continued) Suppose that $X_1, \ldots, X_n$ are iid distributed as Geometric(p) random variables with the common pmf given by $f(x; p) = p(1 - p)^x$, x = 0, 1, 2, ..., and 0 < p < 1 is the unknown parameter. Find the MLE of p. Is the MLE sufficient for p?
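Here the log-likelihood is n log p + (Σ xᵢ) log(1 − p), which is maximized at $\hat{p} = 1/(1 + \overline{X})$. A sketch comparing the closed form against a direct numerical maximization (the data-generating value of p is illustrative):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# MLE for Geometric(p) with pmf p*(1 - p)**x: the log-likelihood
# n*log(p) + sum(x)*log(1 - p) is maximized at p_hat = 1/(1 + x_bar).
rng = np.random.default_rng(4)
x = rng.geometric(0.4, size=5000) - 1  # shift numpy's support to {0, 1, 2, ...}

def neg_loglik(p):
    return -(x.size * np.log(p) + x.sum() * np.log(1.0 - p))

closed_form = 1.0 / (1.0 + x.mean())
numeric = minimize_scalar(neg_loglik, bounds=(1e-6, 1 - 1e-6), method="bounded").x
print(closed_form, numeric)  # the two answers should agree
```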
7.2.7 Suppose that $X_1, \ldots, X_n$ are iid Bernoulli(p) random variables where p is the unknown parameter, 0 ≤ p ≤ 1.
(i) Show that $\overline{X}$ is the MLE of p. Is the MLE sufficient for p?
(ii) Derive the MLE for $p^2$;
(iii) Derive the MLE for p/q where q = 1 - p;
(iv) Derive the MLE for p(1 - p).
{Hint: In parts (ii)-(iv), use the invariance property of the MLE from Theorem 7.2.1.}
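The hint's invariance property is easy to see in action: once $\hat{p} = \overline{X}$ is the MLE of p, the MLE of g(p) is g($\hat{p}$) for each function in parts (ii)-(iv). A minimal sketch with illustrative data:

```python
import numpy as np

# Invariance of the MLE: with p_hat = x_bar the MLE of p, the MLE of
# g(p) is g(p_hat) for the functions appearing in parts (ii)-(iv).
rng = np.random.default_rng(5)
x = rng.binomial(1, 0.6, size=1000)

p_hat = x.mean()                 # part (i): MLE of p
print(p_hat**2)                  # part (ii): MLE of p^2
print(p_hat / (1.0 - p_hat))     # part (iii): MLE of p/q with q = 1 - p
print(p_hat * (1.0 - p_hat))     # part (iv): MLE of p(1 - p)
```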
7.2.8 (Exercise 7.2.7 Continued) Suppose that we have iid Bernoulli(p) random variables $X_1, \ldots, X_n$ where p is the unknown parameter, 0 < p < 1. Show that $\overline{X}$ is the MLE of p when $\overline{X}$ is not zero or one. In light of Example 7.2.10, discuss the situation one faces when $\overline{X}$ is zero or one.
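A one-line observation that may help frame the discussion (a sketch of the standard argument, not the text's own wording): when $\overline{X} = 0$ the likelihood is strictly decreasing on (0, 1), so its supremum is approached as p → 0 but never attained inside the open parameter space; the case $\overline{X} = 1$ is symmetric.

```latex
% Bernoulli likelihood written in terms of the sample mean:
L(p) = p^{\,n\bar{x}}\,(1-p)^{\,n(1-\bar{x})}, \quad 0 < p < 1;
% if \bar{x} = 0 this collapses to
\bar{x} = 0 \;\Longrightarrow\; L(p) = (1-p)^{n},
% which is strictly decreasing, so \sup_{0<p<1} L(p) = 1 is not attained.
```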
7.2.9 Suppose that $X_1, \ldots, X_n$ are iid $N(\mu, \sigma^2)$ where µ is known but $\sigma^2$ is unknown, $\theta = \sigma^2$, −∞ < µ < ∞, 0 < σ < ∞, n ≥ 2.
(i) Show that the MLE of $\sigma^2$ is $n^{-1}\sum_{i=1}^{n}(X_i - \mu)^2$;
(ii) Is the MLE in part (i) sufficient for $\sigma^2$?
(iii) Derive the MLE for 1/σ;
(iv) Derive the MLE for $(\sigma + \sigma^{-1})^{1/2}$.
{Hint: In parts (iii)-(iv), use the invariance property of the MLE from Theorem 7.2.1.}
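For a numerical companion to parts (i) and (iii)-(iv): with µ known, the MLE of $\sigma^2$ is the average squared deviation about µ, and invariance then carries the estimate through each function of σ. A hedged sketch (the values of µ, σ, and the sample size are illustrative):

```python
import numpy as np

# MLE of sigma^2 when mu is known: sigma2_hat = mean((x - mu)**2).
# The invariance property then yields the MLEs in parts (iii)-(iv).
rng = np.random.default_rng(6)
mu, sigma_true = 1.0, 2.0
x = rng.normal(mu, sigma_true, size=10000)

sigma2_hat = ((x - mu)**2).mean()            # part (i): MLE of sigma^2
sigma_hat = np.sqrt(sigma2_hat)
print(1.0 / sigma_hat)                       # part (iii): MLE of 1/sigma
print(np.sqrt(sigma_hat + 1.0 / sigma_hat))  # part (iv): MLE of (sigma + 1/sigma)^(1/2)
```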