Page 327 - Probability and Statistical Inference
6. Sufficiency, Completeness, and Ancillarity
Example 6.4.4 (Example 6.4.2 Continued) Let $X_1, \ldots, X_n$ be iid $N(\mu, \sigma^2)$ where $\mu \in (-\infty, \infty)$ is the unknown parameter. Here $\sigma \in (0, \infty)$ is assumed known. We had shown earlier that the statistic $T = \bar{X}$ was sufficient for $\mu$.
Let us now pursue $T$ from the information point of view. The statistic $T$ is distributed as $N(\mu, n^{-1}\sigma^2)$, so one can start with the pdf $g(t; \mu)$ of $T$ and verify that $I_T(\mu) = n\sigma^{-2}$ as follows: write $\log\{g(t; \mu)\} = -\tfrac{1}{2}\log(2\pi n^{-1}\sigma^2) - \tfrac{1}{2}\{n(t - \mu)^2/\sigma^2\}$, which implies that $\frac{\partial}{\partial\mu}\log\{g(t; \mu)\} = n(t - \mu)/\sigma^2$. Hence, we have $I_T(\mu) = E_\mu[n^2(T - \mu)^2/\sigma^4] = n\sigma^{-2}$ since $E_\mu[(T - \mu)^2] = V_\mu(T) = n^{-1}\sigma^2$. From Example 6.4.2, however, we know that the information contained in one single observation is $I_{X_1}(\mu) = \sigma^{-2}$, and thus, in view of (6.4.4), we have $I_X(\mu) = nI_{X_1}(\mu) = n\sigma^{-2}$. That is, $T$ preserves the available information from the whole data $X$. Now, Theorem 6.4.2 would imply that the statistic $T$ is indeed sufficient for $\mu$. !
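As a quick numerical sanity check (not from the text), one can estimate $I_T(\mu)$ by Monte Carlo as the mean squared score of $T$ and compare it with $n\sigma^{-2}$. The values of $\mu$, $\sigma$, and $n$ below are purely illustrative:

```python
import random

# Monte Carlo sketch: estimate I_T(mu) = E_mu[(n(T - mu)/sigma^2)^2]
# for T = sample mean of n iid N(mu, sigma^2) observations, and
# compare with the closed form n / sigma^2. Parameter values are
# illustrative assumptions, not from the text.
random.seed(0)
mu, sigma, n = 2.0, 1.5, 10
reps = 200_000

acc = 0.0
for _ in range(reps):
    # draw T directly, since T ~ N(mu, sigma^2 / n)
    t = random.gauss(mu, sigma / n ** 0.5)
    score = n * (t - mu) / sigma ** 2   # d/dmu of log g(t; mu)
    acc += score ** 2

i_t_hat = acc / reps                    # Monte Carlo estimate of I_T(mu)
print(i_t_hat, n / sigma ** 2)          # the two values should be close
```

With a large number of replications the estimate settles near $n/\sigma^2$, illustrating that the sample mean carries all $n\sigma^{-2}$ units of information.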
Remark 6.4.1 Suppose that the pmf or pdf $f(x; \theta)$ is such that $\frac{\partial^2}{\partial\theta^2}\log f(x; \theta)$ is finite for all $x \in \chi$ and $E_\theta\left[\frac{\partial^2}{\partial\theta^2}\log f(X; \theta)\right]$ is finite for all $\theta \in \Theta$. Then the Fisher information defined earlier can be alternatively evaluated using the following expression:
$$I_X(\theta) = -E_\theta\left[\frac{\partial^2}{\partial\theta^2}\log f(X; \theta)\right]. \quad (6.4.9)$$
We leave its verification as an exercise.
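A sketch of the idea behind the verification, assuming the usual regularity conditions that permit differentiating the identity $E_\theta\left[\frac{\partial}{\partial\theta}\log f(X;\theta)\right] = 0$ under the integral (or summation) sign:

```latex
% Differentiating E_theta[ d/dtheta log f(X;theta) ] = 0 once more
% with respect to theta (regularity assumed) gives
\begin{align*}
0 &= \frac{\partial}{\partial\theta}
     E_\theta\!\left[\frac{\partial}{\partial\theta}\log f(X;\theta)\right] \\
  &= E_\theta\!\left[\frac{\partial^2}{\partial\theta^2}\log f(X;\theta)\right]
   + E_\theta\!\left[\left\{\frac{\partial}{\partial\theta}
     \log f(X;\theta)\right\}^2\right],
\end{align*}
% so that the Fisher information
% I_X(\theta) = E_\theta[\{\partial_\theta \log f(X;\theta)\}^2]
% coincides with -E_\theta[\partial_\theta^2 \log f(X;\theta)].
```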
Example 6.4.5 (Example 6.4.1 Continued) Use (6.4.9) and observe that $\frac{\partial^2}{\partial\lambda^2}\log f(x; \lambda) = -x\lambda^{-2}$ so that $I_X(\lambda) = -E_\lambda\left[\frac{\partial^2}{\partial\lambda^2}\log f(X; \lambda)\right] = -E_\lambda[-X\lambda^{-2}] = \lambda^{-1}$. !
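The Poisson calculation is easy to confirm numerically: $-E_\lambda[-X\lambda^{-2}] = E_\lambda[X]/\lambda^2 = \lambda^{-1}$. A minimal sketch with an illustrative value of $\lambda$, summing the pmf directly:

```python
import math

# Numeric check that -E[ d^2/dlambda^2 log f(X;lambda) ] = 1/lambda
# for X ~ Poisson(lambda). The value of lambda is an illustrative
# assumption; the pmf sum is truncated where terms are negligible.
lam = 3.0
e_x = sum(k * math.exp(-lam) * lam ** k / math.factorial(k)
          for k in range(100))         # E[X] by direct summation
info = e_x / lam ** 2                  # -E[-X * lam^{-2}]
print(info, 1 / lam)                   # both should equal 1/lambda
```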
Example 6.4.6 (Example 6.4.2 Continued) Use (6.4.9) and observe that $\frac{\partial^2}{\partial\mu^2}\log f(x; \mu) = -\sigma^{-2}$ so that $I_X(\mu) = -E_\mu\left[\frac{\partial^2}{\partial\mu^2}\log f(X; \mu)\right] = E_\mu[\sigma^{-2}] = \sigma^{-2}$. !
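One can also see numerically that the second derivative of the normal log density with respect to $\mu$ is the constant $-\sigma^{-2}$, whatever the value of $x$, which is why the expectation in (6.4.9) is trivial here. A small sketch using a central second difference (all values illustrative):

```python
import math

# Numerical check that d^2/dmu^2 log f(x; mu) = -1/sigma^2 for the
# N(mu, sigma^2) density, independent of x. Parameter values are
# illustrative assumptions.
sigma = 2.0

def log_f(x, mu):
    return (-0.5 * math.log(2 * math.pi * sigma ** 2)
            - (x - mu) ** 2 / (2 * sigma ** 2))

mu, h = 1.0, 1e-4
for x in (-3.0, 0.5, 4.0):
    # central second difference approximates the second derivative
    d2 = (log_f(x, mu + h) - 2 * log_f(x, mu) + log_f(x, mu - h)) / h ** 2
    print(x, d2)   # each value is close to -1/sigma^2
```

Since the log density is quadratic in $\mu$, the second difference is essentially exact, and averaging the constant $-\sigma^{-2}$ over $X$ gives $I_X(\mu) = \sigma^{-2}$ immediately.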
In Exercise 6.4.16, we pursue an idea like this: suppose
that a statistic T is not sufficient for θ. Can we say something
about how non-sufficient T is for θ?
6.4.2 Multi-parameter Situation
When the unknown parameter θ is multidimensional, the definition of the
Fisher information gets more involved. To keep the presentation simple,