6. Sufficiency, Completeness, and Ancillarity
6.5.6 (Exercise 6.3.12 Continued) Let X_1, ..., X_n be iid having the common Uniform distribution on the interval (θ, 2θ) where 0 < θ < ∞ is the unknown parameter. Show that X_{n:1}/X_{n:n} is an ancillary statistic for θ. Also show that X_{n:n}/(X_{n:n} − X_{n:1}) is an ancillary statistic for θ.
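A quick Monte Carlo check makes the claim concrete. The sketch below is a rough illustration, assuming the interval is (θ, 2θ) as in Exercise 6.3.12; the function name, sample size, and replication count are arbitrary choices. It simulates X_{n:1}/X_{n:n} under several values of θ; ancillarity predicts that the empirical quantiles agree up to simulation error, and the same experiment applied to X_{n:n}/(X_{n:n} − X_{n:1}) tells the same story.

```python
import numpy as np

def ratio_min_max(theta, n=10, reps=100_000, rng=None):
    """Simulate X_{n:1} / X_{n:n} for iid Uniform(theta, 2*theta) samples."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = rng.uniform(theta, 2 * theta, size=(reps, n))
    return x.min(axis=1) / x.max(axis=1)

# Quantiles should be (nearly) identical across theta: the statistic is ancillary.
for theta in (0.5, 3.0, 20.0):
    print(theta, np.quantile(ratio_min_max(theta), [0.1, 0.5, 0.9]).round(3))
```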
6.5.7 (Curved Exponential Family) Suppose that (X, Y) has a particular curved exponential family of distributions with the joint pdf given by
f(x, y; θ) = exp{−(θx + y/θ)}, 0 < x, y < ∞,
where θ(> 0) is the unknown parameter. This distribution was discussed by Fisher (1934, 1956) in the context of the famous Nile example. Denote U = XY, V = X/Y. Show that U is ancillary for θ, but (U, V) is minimal sufficient for θ, whereas V by itself is not sufficient for θ. {Hint: The answers to Exercise 4.4.10 would help in this problem.}
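Assuming the joint pdf meant here is Fisher's Nile-example density exp{−(θx + y/θ)} for x, y > 0, so that X and Y are independent exponentials with means 1/θ and θ, a short simulation illustrates the contrast between U and V; the names and constants below are illustrative only.

```python
import numpy as np

def nile_sample(theta, reps=100_000, rng=None):
    """Draw (X, Y) with X ~ Exp(mean 1/theta) and Y ~ Exp(mean theta), independent,
    matching the assumed Nile-example pdf exp{-(theta*x + y/theta)}, x, y > 0."""
    rng = np.random.default_rng(1) if rng is None else rng
    return rng.exponential(1 / theta, reps), rng.exponential(theta, reps)

for theta in (0.5, 2.0, 10.0):
    x, y = nile_sample(theta)
    u, v = x * y, x / y
    # The quantiles of U = XY stay put as theta varies (ancillarity),
    # while those of V = X/Y drift with theta.
    print(theta, np.quantile(u, [0.25, 0.75]).round(3), round(np.median(v), 4))
```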
6.5.8 Show that the families F_1, F_2, F_3 defined via (6.5.7) include only genuine pdfs.
6.5.9 (Exercise 6.3.8 Continued) Let X_1, ..., X_n be iid N(θ, θ²) where 0 < θ < ∞ is the unknown parameter. Construct several ancillary statistics for θ. Does the common pdf belong to one of the special families defined via (6.5.7)?
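Since X_i ~ N(θ, θ²) can be written as X_i = θ(1 + Z_i) with Z_i standard normal, the sample forms a scale family, and any scale-invariant function, for example the ratio of the sample mean to the sample standard deviation, is ancillary; this is one possible construction, not the only one. A minimal simulation sketch, with illustrative names and constants:

```python
import numpy as np

def mean_over_sd(theta, n=15, reps=100_000, rng=None):
    """Simulate Xbar / S for iid N(theta, theta^2) data; the ratio is
    scale-invariant, so its distribution should not depend on theta."""
    rng = np.random.default_rng(2) if rng is None else rng
    x = rng.normal(theta, theta, size=(reps, n))
    return x.mean(axis=1) / x.std(axis=1, ddof=1)

for theta in (0.2, 1.0, 50.0):
    print(theta, np.quantile(mean_over_sd(theta), [0.1, 0.5, 0.9]).round(3))
```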
6.5.10 (Exercise 6.3.9 Continued) Let X_1, ..., X_n be iid N(θ, θ) where 0 < θ < ∞ is the unknown parameter. Construct several ancillary statistics for θ. Does the common pdf belong to one of the special families defined via (6.5.7)?
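Here N(θ, θ) is neither a pure location nor a pure scale family, but differences X_i − X_j are N(0, 2θ), so any scale-invariant function of such differences, for instance (X_1 − X_2)/(X_3 − X_4), is ancillary; in fact it is standard Cauchy for every θ. The check below is a rough sketch with illustrative constants:

```python
import numpy as np

def diff_ratio(theta, reps=100_000, rng=None):
    """Simulate (X1 - X2) / (X3 - X4) for iid N(theta, theta) observations;
    both differences are N(0, 2*theta), so the ratio is standard Cauchy."""
    rng = np.random.default_rng(3) if rng is None else rng
    x = rng.normal(theta, np.sqrt(theta), size=(reps, 4))
    return (x[:, 0] - x[:, 1]) / (x[:, 2] - x[:, 3])

for theta in (0.3, 2.0, 25.0):
    # Quartiles of a standard Cauchy are -1, 0, 1, whatever theta is.
    print(theta, np.quantile(diff_ratio(theta), [0.25, 0.5, 0.75]).round(3))
```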
6.5.11 (Exercise 6.3.10 Continued) Let X_1, ..., X_n be iid having the common pdf θ^{-1} exp{−(x − θ)/θ}I(x > θ) where 0 < θ < ∞ is the unknown parameter. Construct several ancillary statistics for θ. Does the common pdf belong to one of the special families defined via (6.5.7)?
6.5.12 (Exercise 6.3.11 Continued) Let X_1, ..., X_n be iid having the common pdf θ^{-2} exp{−(x − θ)/θ²}I(x > θ) where 0 < θ < ∞ is the unknown parameter. Construct several ancillary statistics for θ. Does the common pdf belong to one of the special families defined via (6.5.7)?
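For both this exercise and Exercise 6.5.11, one convenient device is the representation X_i = θ + θ²E_i (here) or X_i = θ(1 + E_i) (in 6.5.11) with E_1, ..., E_n iid standard exponential: every difference X_i − X_j is a θ-dependent multiple of a θ-free quantity, so ratios of spacings are ancillary. A Monte Carlo sketch for the present pdf, with illustrative names and constants:

```python
import numpy as np

def spacing_ratio(theta, n=10, reps=100_000, rng=None):
    """Simulate (Xbar - X_{n:1}) / (X_{n:n} - X_{n:1}) for iid draws from
    theta^{-2} exp{-(x - theta)/theta^2} I(x > theta), generated as
    X = theta + theta^2 * E with E ~ Exp(1)."""
    rng = np.random.default_rng(4) if rng is None else rng
    x = theta + theta**2 * rng.exponential(size=(reps, n))
    lo, hi = x.min(axis=1), x.max(axis=1)
    return (x.mean(axis=1) - lo) / (hi - lo)

for theta in (0.5, 2.0, 10.0):
    print(theta, np.quantile(spacing_ratio(theta), [0.1, 0.5, 0.9]).round(3))
```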
6.5.13 Use the Definition 6.4.1 of the information and the pdf from (6.5.4) to derive directly the expression for I_{X,Y}(ρ). {Hint: Take the log of the pdf from (6.5.4). Then take the partial derivative of this with respect to ρ. Next, square this partial derivative and take its expectation.}
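The hint can also be checked numerically. Assuming (6.5.4) stands for the standard bivariate normal pdf N_2(0, 0, 1, 1, ρ), differentiating its log with respect to ρ gives the score ρ/(1 − ρ²) − [ρ(x² + y²) − (1 + ρ²)xy]/(1 − ρ²)², and averaging the squared score over simulated pairs should reproduce the familiar closed form (1 + ρ²)/(1 − ρ²)². The sketch below is only a numerical companion to the requested derivation:

```python
import numpy as np

def score_rho(x, y, rho):
    """d/d(rho) of log f(x, y; rho) for the standard bivariate normal
    N_2(0, 0, 1, 1, rho) (the assumed form of the pdf in (6.5.4))."""
    return (rho / (1 - rho**2)
            - (rho * (x**2 + y**2) - (1 + rho**2) * x * y) / (1 - rho**2)**2)

rng = np.random.default_rng(5)
for rho in (-0.6, 0.0, 0.4, 0.8):
    xy = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=500_000)
    mc_info = np.mean(score_rho(xy[:, 0], xy[:, 1], rho)**2)
    print(rho, round(mc_info, 3), round((1 + rho**2) / (1 - rho**2)**2, 3))
```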
6.5.14 Suppose that (X, Y) is distributed as N_2(0, 0, σ², σ², ρ) where -1 < ρ < 1 is the unknown parameter while 0 < σ (≠ 1) < ∞ is assumed known. Evaluate the expression for the information matrix I_{X,Y}(ρ) along