Probability and Statistical Inference
6. Sufficiency, Completeness, and Ancillarity
the lines of the Example 6.5.8. Next, along the lines of the Example 6.5.11,
recover the lost information in X by means of conditioning on the ancillary
statistic Y.
6.5.15 (Example 6.5.8 Continued) Suppose that (X, Y) is distributed as
N₂(0, 0, σ², σ², ρ) where θ = (σ², ρ) with the unknown parameters σ², ρ
where 0 < σ < ∞, −1 < ρ < 1. Evaluate the expression for the information
matrix I_{X,Y}(θ). {Hint: Does working with U = X + Y, V = X − Y help in the
derivation?}
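One way to follow the hint, sketched here as an illustration (not the book's derivation): U = X + Y and V = X − Y are independent with U ~ N(0, 2σ²(1 + ρ)) and V ~ N(0, 2σ²(1 − ρ)), so the joint log-likelihood splits, and since its second derivatives are linear in u², v², taking expectations amounts to substituting E[U²] and E[V²]:

```python
import sympy as sp

u, v = sp.symbols('u v', real=True)
s = sp.symbols('s', positive=True)      # s stands for sigma^2
r = sp.symbols('rho', real=True)        # rho in (-1, 1)

a = 2*s*(1 + r)   # Var(U)
b = 2*s*(1 - r)   # Var(V)
logL = (-sp.Rational(1, 2)*sp.log(2*sp.pi*a) - u**2/(2*a)
        - sp.Rational(1, 2)*sp.log(2*sp.pi*b) - v**2/(2*b))

theta = (s, r)
I = sp.zeros(2, 2)
for i in range(2):
    for j in range(2):
        d2 = -sp.diff(logL, theta[i], theta[j])
        # second derivatives are linear in u^2 and v^2, so the expectation
        # is obtained by substituting E[U^2] = a, E[V^2] = b
        I[i, j] = sp.simplify(d2.subs({u**2: a, v**2: b}))

print(I)
```

The printed matrix has entries I₁₁ = 1/σ⁴, I₁₂ = I₂₁ = −ρ/{σ²(1 − ρ²)}, I₂₂ = (1 + ρ²)/(1 − ρ²)², which can be compared with the answer obtained by hand.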
6.5.16 Suppose that X′ = (X₁, ..., X_p) where X is distributed as multivariate
normal N_p(0, σ²[(1 − ρ)I_{p×p} + ρ1_p1′_p]) with 1′_p = (1, 1, ..., 1), σ ∈ ℜ⁺ and
−(p − 1)⁻¹ < ρ < 1. We assume that θ = (σ², ρ) where σ², ρ are the unknown
parameters, 0 < σ < ∞, −(p − 1)⁻¹ < ρ < 1. Evaluate the expression for the informa-
tion matrix I_X(θ). {Hint: Try the Helmert transformation from the Example
2.4.9 on X to generate p independent normal variables each with zero mean,
and variances depending on both σ² and ρ, while (p − 1) of these variances are
all equal but different from the p-th one. Is it then possible to use the Theorem
6.4.3?}
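A quick numerical sanity check of the hint (illustrative only; the Helmert matrix below is the standard construction, which may differ in row ordering from the Example 2.4.9): applying an orthogonal Helmert matrix H to the equicorrelated covariance diagonalizes it, with first variance σ²{1 + (p − 1)ρ} and the remaining (p − 1) variances all equal to σ²(1 − ρ):

```python
import numpy as np

def helmert(p):
    """Orthogonal p x p Helmert matrix: first row is the normalized
    vector of ones, remaining rows are orthonormal contrasts."""
    H = np.zeros((p, p))
    H[0, :] = 1.0 / np.sqrt(p)
    for k in range(1, p):
        H[k, :k] = 1.0 / np.sqrt(k * (k + 1))
        H[k, k] = -k / np.sqrt(k * (k + 1))
    return H

# assumed illustrative values of p, sigma^2, rho
p, sigma2, rho = 5, 2.0, 0.3
Sigma = sigma2 * ((1 - rho) * np.eye(p) + rho * np.ones((p, p)))
H = helmert(p)
D = H @ Sigma @ H.T     # covariance matrix of the transformed vector HX
print(np.round(D, 10))
```

Since D comes out diagonal with the two distinct variances above, the components of HX are independent normals and the Theorem 6.4.3 applies.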
6.5.17 (Example 6.5.8 Continued) Suppose that (X, Y) is distributed as
N₂(0, 0, 1, 1, ρ) where ρ is the unknown parameter, −1 < ρ < 1. Start with the
pdf of (X, Y) and then directly apply the equivalent formula from the equation
(6.4.9) for the evaluation of the expression of I_{X,Y}(ρ).
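A sketch of such a direct computation (an illustration; the equation (6.4.9) is taken here to be the second-derivative form I(ρ) = E[−∂² ln f/∂ρ²]): differentiate the bivariate normal log-pdf twice in ρ and substitute the standard moments E[X²] = E[Y²] = 1, E[XY] = ρ, which suffices because the second derivative is linear in x², y², and xy:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
r = sp.symbols('rho', real=True)        # rho in (-1, 1)

Q = x**2 - 2*r*x*y + y**2
logf = (-sp.log(2*sp.pi) - sp.Rational(1, 2)*sp.log(1 - r**2)
        - Q/(2*(1 - r**2)))

d2 = sp.expand(-sp.diff(logf, r, 2))
# expectation via moment substitution: E[X^2] = E[Y^2] = 1, E[XY] = rho
info = sp.simplify(d2.subs({x**2: 1, y**2: 1, x*y: r}))
print(info)
```

This reproduces the known answer I_{X,Y}(ρ) = (1 + ρ²)/(1 − ρ²)².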
6.6.1 Suppose that X₁, X₂ are iid Poisson(λ) where λ(> 0) is the unknown
parameter. Consider the family of distributions induced by the statistic T =
(X₁, X₂). Is this family, indexed by λ, complete?
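A Monte Carlo probe can suggest the answer: completeness fails if some statistic g(X₁, X₂), nonzero with positive probability, has mean zero for every λ. The sketch below (illustrative; the candidate g is an assumption of this sketch, not given in the text) tries g(x₁, x₂) = x₁ − x₂:

```python
import numpy as np

rng = np.random.default_rng(0)
g = lambda x1, x2: x1 - x2   # candidate counterexample to completeness

means = {}
for lam in (0.5, 1.0, 3.0):
    x1 = rng.poisson(lam, 200_000)
    x2 = rng.poisson(lam, 200_000)
    means[lam] = g(x1, x2).mean()   # should be near 0 for every lambda
    print(lam, means[lam])
```

The sample means of g stay near zero for each λ, while g itself is clearly not zero almost surely, which is the shape of a completeness counterexample; the exercise asks for the rigorous version.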
6.6.2 (Exercise 6.3.5 Continued) Let X₁, ..., Xₙ be iid Geometric(p), that is
the common pmf is given by f(x; p) = p(1 − p)ˣ, x = 0, 1, 2, ... where 0 < p <
1 is the unknown parameter. Is the statistic T = X₁ + ... + Xₙ complete sufficient for p?
{Hint: Is it possible to use the Theorem 6.6.2 here?}
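The hint points to the exponential-family route: the pmf can be rewritten as exp{x log(1 − p) + log p}, a one-parameter exponential family whose natural parameter log(1 − p) ranges over an open interval as p varies in (0, 1). A tiny numerical check of that rewriting (illustrative only):

```python
import math

# verify p(1-p)^x == exp{x*log(1-p) + log p} on a small grid
results = []
for p in (0.2, 0.5, 0.8):
    for x in range(6):
        lhs = p * (1 - p) ** x                             # geometric pmf
        rhs = math.exp(x * math.log(1 - p) + math.log(p))  # exp-family form
        results.append(math.isclose(lhs, rhs))
print(all(results))
```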
6.6.3 Let X₁, ..., Xₙ be iid N(θ, θ²) where 0 < θ < ∞ is the unknown
parameter. Is the minimal sufficient statistic complete? Show that X̄ and S²
are independent. {Hint: Can the Example 4.4.9 be used here to solve the
second part?}
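For the completeness part, note (using moment calculations not stated in the text) that E[X̄²] = θ²(1 + 1/n) and E[S²] = θ², so g = nX̄²/(n + 1) − S² has mean zero for every θ without being zero almost surely. A Monte Carlo sketch (illustrative; n and the θ values are assumed) consistent with incompleteness:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 10, 400_000
gmeans = []
for theta in (1.0, 2.5):
    x = rng.normal(theta, theta, size=(reps, n))   # N(theta, theta^2)
    xbar = x.mean(axis=1)
    s2 = x.var(axis=1, ddof=1)
    g = n / (n + 1) * xbar**2 - s2                 # E[g] = 0 for every theta
    gmeans.append(g.mean())
print(gmeans)
```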
6.6.4 Let X₁, ..., Xₙ be iid N(θ, θ) where 0 < θ < ∞ is the unknown param-
eter. Is the minimal sufficient statistic complete? Show that X̄ and S² are
independent. {Hint: Can the Example 4.4.9 be used here to solve the second
part?}
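For the independence part, a quick Monte Carlo illustration (consistent with, though of course not a proof of, independence): across many replications the sample correlation between X̄ and S² should be near zero. A sketch under assumed values θ = 2, n = 8:

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 2.0, 8, 200_000
x = rng.normal(theta, np.sqrt(theta), size=(reps, n))  # N(theta, theta)
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)
corr = np.corrcoef(xbar, s2)[0, 1]
print(corr)
```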
6.6.5 Let X₁, ..., Xₙ be iid having a negative exponential distribution
with the common pdf θ⁻¹ exp{−(x − θ)/θ}I(x > θ) where 0 < θ < ∞ is the
unknown parameter. Is the minimal sufficient statistic complete? Show