situations separately. The notion of the information about an unknown parameter θ contained in the data was introduced by F. Y. Edgeworth in a series of papers published in the J. Roy. Statist. Soc. during 1908-1909. Fisher (1922) gave a systematic development of this concept. The reader is referred to Efron's (1998, p. 101) recent commentaries on (Fisher) information.
6.4.1 One-parameter Situation
Suppose that X is an observable real-valued random variable with the pmf or pdf f(x; θ), where the unknown parameter θ ∈ Θ, an open subinterval of ℜ, while the space χ is assumed not to depend upon θ. We assume throughout that the partial derivative ∂f(x; θ)/∂θ exists and is finite for all x ∈ χ, θ ∈ Θ. We also assume that we can interchange the derivative (with respect to θ) and the integral (with respect to x).
Definition 6.4.1 The Fisher information, or simply the information, about θ, contained in the data, is given by

    I_X(θ) = E_θ{[∂/∂θ log f(X; θ)]²}.
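Since this expectation is evaluated in closed form only in textbook cases, a quick numerical cross-check can be helpful. The following is a minimal Monte Carlo sketch in Python; the helper name fisher_info_mc, the sample size, and the seed are our own choices, not part of the text. It estimates I_X(θ) as the sample mean of the squared score over simulated draws.

    import numpy as np

    def fisher_info_mc(draw, score, theta, n=200_000, seed=0):
        # Monte Carlo estimate of I_X(theta) = E_theta[(d/dtheta log f(X; theta))^2]:
        # average the squared score over n simulated observations.
        rng = np.random.default_rng(seed)
        x = draw(rng, theta, n)              # n draws from f(.; theta)
        return np.mean(score(x, theta) ** 2)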
Example 6.4.1 Let X be Poisson(λ), λ > 0. Now,

    log f(x; λ) = −λ + x log λ − log(x!),

which implies that ∂/∂λ log f(x; λ) = −1 + x/λ = (x − λ)/λ. Thus, we have

    I_X(λ) = E_λ{[(X − λ)/λ]²} = λ⁻² E_λ[(X − λ)²] = λ⁻² λ = 1/λ,

since E_λ[(X − λ)²] = V(X) = λ. That is, as we contemplate larger and larger values of λ, the variability built into X increases, and hence it seems natural that the information about the unknown parameter λ contained in the data X goes down further and further.
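This closed form can be checked against the fisher_info_mc sketch given after Definition 6.4.1; the score (x − λ)/λ comes from the derivation above, while the value λ = 3 is an arbitrary choice for illustration:

    lam = 3.0
    est = fisher_info_mc(lambda rng, t, n: rng.poisson(t, n),
                         lambda x, t: (x - t) / t,
                         lam)
    print(est, 1 / lam)    # both close to 1/3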
Example 6.4.2 Let X be N(µ, σ²) where µ ∈ (−∞, ∞) is the unknown parameter. Here, σ ∈ (0, ∞) is assumed known. Now,

    log f(x; µ) = −log(σ√(2π)) − (x − µ)²/(2σ²),

which implies that ∂/∂µ log f(x; µ) = (x − µ)/σ². Thus, we have

    I_X(µ) = E_µ{[(X − µ)/σ²]²} = σ⁻⁴ E_µ[(X − µ)²] = σ⁻⁴ σ² = 1/σ²,

since E_µ[(X − µ)²] = V(X) = σ².
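Again, a numerical check with the same fisher_info_mc sketch is possible; the values µ = 1 and σ = 2 below are arbitrary:

    mu, sigma = 1.0, 2.0
    est = fisher_info_mc(lambda rng, t, n: rng.normal(t, sigma, n),
                         lambda x, t: (x - t) / sigma**2,
                         mu)
    print(est, 1 / sigma**2)    # both close to 0.25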