Page 344 - Probability and Statistical Inference
6. Sufficiency, Completeness, and Ancillarity 321
Example 6.6.4 Let X_1, ..., X_n be iid Poisson(λ) where λ(> 0) is the unknown parameter. We know that T = Σ_{i=1}^n X_i is a sufficient statistic for λ. Let us verify that T is also complete. Since T is distributed as Poisson(nλ), the pmf induced by T is given by g(t; λ) = e^{-nλ}(nλ)^t/t!, t ∈ T = {0, 1, 2, ...}.
Consider any real valued function h(t) such that E_λ[h(T)] = 0 for all 0 < λ < ∞. Now, let us focus on the possible values of h(t) for t = 0, 1, 2, ... . With k(t) = h(t)n^t/t!, we can write

E_λ[h(T)] = Σ_{t=0}^∞ h(t)e^{-nλ}(nλ)^t/t! = e^{-nλ} Σ_{t=0}^∞ k(t)λ^t.   (6.6.5)
From (6.6.5) we observe that E_λ[h(T)] has been expressed as an infinite power series in the variable λ belonging to (0, ∞). The collection of such infinite power series forms a vector space generated by the set {1, λ, λ², λ³, ..., λ^t, ...}. Also, this set happens to be the smallest generator in the sense that if any element is dropped from it, then it will be unable to generate all infinite power series in λ. In other words, the vectors belonging to this set are linearly independent. So, if we are forced to assume that E_λ[h(T)] = 0 for all 0 < λ < ∞, we will then conclude that every coefficient in (6.6.5) must vanish, that is, k(t) = 0 for all t = 0, 1, 2, ... . Hence, we conclude that h(t) ≡ 0 for all t = 0, 1, 2, ..., proving the completeness of the sufficient statistic T. !
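As a quick numerical sanity check (my own illustration, not part of the text), the sketch below simulates T = X_1 + ... + X_n for iid Poisson(λ) draws and compares the empirical distribution of T against the induced pmf g(t; λ) = e^{-nλ}(nλ)^t/t! used above; the values n = 5 and λ = 0.7 and the helper names are arbitrary choices.

```python
import math
import random

def poisson_sample(lam, rng):
    # Inverse-CDF sampling for Poisson(lam); adequate for small lam.
    u = rng.random()
    t, p = 0, math.exp(-lam)
    cdf = p
    while u > cdf:
        t += 1
        p *= lam / t
        cdf += p
    return t

def pmf_T(t, n, lam):
    # Induced pmf of T = sum of n iid Poisson(lam): g(t; lam) = e^{-n lam}(n lam)^t / t!
    return math.exp(-n * lam) * (n * lam) ** t / math.factorial(t)

rng = random.Random(1)
n, lam, reps = 5, 0.7, 200_000
counts = {}
for _ in range(reps):
    T = sum(poisson_sample(lam, rng) for _ in range(n))
    counts[T] = counts.get(T, 0) + 1

# Empirical frequency vs. theoretical g(t; lam) for small t.
for t in range(8):
    print(t, round(counts.get(t, 0) / reps, 4), round(pmf_T(t, n, lam), 4))
```

With nλ = 3.5, the empirical frequencies should track g(t; λ) closely, which is all the sufficiency argument needs from the distribution of T.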
In general, a sufficient or minimal sufficient statistic T for an unknown parameter θ is not necessarily complete. Look at Examples 6.6.5-6.6.6.
Example 6.6.5 Let X_1, ..., X_n be iid Normal(θ, θ²) where θ(> 0) is the unknown parameter. One should verify that T = (X̄, S²) is a minimal sufficient statistic for θ, but note that E_θ(X̄) = θ and E_θ(S²) = θ² for all θ > 0. Also, since V_θ(X̄) = θ²/n, we have E_θ(X̄²) = (1 + n⁻¹)θ². That is, for all θ > 0, we have E_θ(n(n + 1)⁻¹X̄² − S²) = 0. Consider h(T) = n(n + 1)⁻¹X̄² − S² and then we have E_θ(h(T)) ≡ 0 for all θ > 0, but h(t) = n(n + 1)⁻¹x̄² − s² is not identically zero for all t ∈ ℜ × ℜ⁺. Hence, the minimal sufficient statistic T cannot be complete. !
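The failure of completeness can also be seen numerically. The sketch below (my own illustration, not from the text) assumes the Normal(θ, θ²) model: since E_θ(X̄²) = (1 + n⁻¹)θ² and E_θ(S²) = θ², the function h(T) = n(n + 1)⁻¹X̄² − S² satisfies E_θ[h(T)] = 0 for every θ > 0 even though h is not identically zero; the values of n, θ, and the replication count are arbitrary.

```python
import random
import statistics

def h(xs):
    # h(T) = n/(n+1) * xbar^2 - s^2 for T = (xbar, s^2);
    # under N(theta, theta^2) this has expectation 0 for every theta > 0.
    n = len(xs)
    xbar = sum(xs) / n
    s2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)
    return n / (n + 1) * xbar ** 2 - s2

rng = random.Random(7)
n, reps = 10, 50_000
means = {}
for theta in (0.5, 1.0, 3.0):
    # gauss(mu, sigma): here mu = theta and sigma = theta, i.e. variance theta^2.
    vals = [h([rng.gauss(theta, theta) for _ in range(n)]) for _ in range(reps)]
    means[theta] = statistics.mean(vals)
    print(theta, round(means[theta], 3))
```

The Monte Carlo average of h(T) is near zero at every θ, yet h evaluated at a fixed sample is plainly nonzero; this is exactly the witness showing that T = (X̄, S²) is not complete.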
Example 6.6.6 Let X_1, ..., X_n be iid with the common pdf given by θ^{-1} exp{-(x - θ)/θ}I(x > θ) where θ(> 0) is the unknown parameter. We note that the likelihood function L(θ) is given by

L(θ) = θ^{-n} exp{-(Σ_{i=1}^n x_i - nθ)/θ} I(x_{n:1} > θ),

and so it follows that U = (X_{n:1}, Σ_{i=1}^n X_i) is a minimal sufficient statistic for
θ. But, the statistic T = is a one-to-one function of