In the Examples 6.6.12-6.6.15, the arguments revolved around sufficiency and completeness to come face to face with the result that X̄ and S² are independent. A reader may get the wrong impression that the completeness property is essential to claim that X̄ and S² are independent. But, note that (X̄, S²) is not a complete statistic when we have random samples from a N(θ, θ) or N(θ, θ²) population, with θ > 0. Yet it is true that X̄ and S² are independent in such situations. Refer to the Example 4.4.9.
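For instance, one quick way to see the incompleteness in the N(θ, θ²) case is the following sketch: since E_θ(X̄) = θ and Var_θ(X̄) = θ²/n, one has
\[
E_\theta(\bar{X}^2) = \frac{n+1}{n}\,\theta^2
\quad\text{and}\quad
E_\theta(S^2) = \theta^2,
\]
so that
\[
E_\theta\Bigl\{\tfrac{n}{n+1}\,\bar{X}^2 - S^2\Bigr\} = 0 \quad \text{for all } \theta > 0,
\]
even though the statistic nX̄²/(n + 1) − S² is not zero with probability one. Hence (X̄, S²) cannot be complete in this case.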
Example 6.6.17 (Example 6.6.15 Continued) Suppose that X₁, ..., Xₙ are iid N(µ, σ²) with (µ, σ) ∈ ℜ × ℜ⁺ where µ and σ are both assumed unknown. By Basu's Theorem, one can immediately claim that the statistic (X̄, S²) and (X̄ − X₁)/(X_{n:n} − X_{n:1}) are independent. !
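A brief sketch of why the ratio in Example 6.6.17 is ancillary: writing Xᵢ = µ + σZᵢ with Z₁, ..., Zₙ iid N(0, 1), the location and scale parameters cancel,
\[
\frac{\bar{X} - X_1}{X_{n:n} - X_{n:1}}
= \frac{(\mu + \sigma\bar{Z}) - (\mu + \sigma Z_1)}{(\mu + \sigma Z_{n:n}) - (\mu + \sigma Z_{n:1})}
= \frac{\bar{Z} - Z_1}{Z_{n:n} - Z_{n:1}},
\]
whose distribution is free of (µ, σ). Since (X̄, S²) is complete sufficient for (µ, σ²), Basu's Theorem applies.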
Example 6.6.18 (Example 6.6.11 Continued) Suppose that X₁, ..., Xₙ are iid Uniform(0, θ) with n ≥ 2, θ(> 0) being the unknown parameter. We know that U = X_{n:n} is a complete sufficient statistic for θ. Let W = X_{n:1}/X_{n:n} which is ancillary for θ. Hence, by Basu's Theorem, X_{n:n} and X_{n:1}/X_{n:n} are independently distributed. Also, X_{n:n} and X̄/S are independent, since X̄/S is ancillary for θ, where S² stands for the sample variance. Using a similar argument, one can also claim that (X_{n:n} − X_{n:1})/S and X_{n:n} are independent. One may look at the scale family of distributions to verify the ancillarity property of the appropriate statistics. !
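To verify the ancillarity claims in Example 6.6.18 through the scale family, one may write Xᵢ = θUᵢ with U₁, ..., Uₙ iid Uniform(0, 1); the parameter θ then cancels in each of the statistics involved, for example
\[
\frac{X_{n:1}}{X_{n:n}} = \frac{U_{n:1}}{U_{n:n}},
\qquad
\frac{\bar{X}}{S} = \frac{\bar{U}}{S_U},
\qquad
\frac{X_{n:n} - X_{n:1}}{S} = \frac{U_{n:n} - U_{n:1}}{S_U},
\]
where S_U denotes the sample standard deviation of U₁, ..., Uₙ. None of these distributions involves θ.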
Remark 6.6.1 We add that a kind of converse of Basu's Theorem was proved later in Basu (1958). Further details are omitted for brevity.
6.7 Exercises and Complements
6.2.1 Suppose that X₁, X₂ are iid Geometric(p), that is, the common pmf is given by f(x; p) = p(1 − p)^x, x = 0, 1, 2, ..., where 0 < p < 1 is the unknown parameter. By means of the conditional distribution approach, show that X₁ + X₂ is sufficient for p.
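One possible outline of the required conditional calculation: for t = x₁ + x₂ fixed and x₁ = 0, 1, ..., t,
\[
P_p\{X_1 = x_1 \mid X_1 + X_2 = t\}
= \frac{p(1-p)^{x_1}\, p(1-p)^{t-x_1}}{\sum_{u=0}^{t} p(1-p)^{u}\, p(1-p)^{t-u}}
= \frac{1}{t+1},
\]
which is free of p, so that X₁ + X₂ is sufficient for p.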
6.2.2 Suppose that X₁, ..., Xₘ are iid Bernoulli(p), Y₁, ..., Yₙ are iid Bernoulli(q), and that the X's are independent of the Y's where 0 < p < 1 is the unknown parameter with q = 1 − p. By means of the conditional distribution approach, show that X₁ + ... + Xₘ + (1 − Y₁) + ... + (1 − Yₙ) is sufficient for p. {Hint: Instead of looking at the data (X₁, ..., Xₘ, Y₁, ..., Yₙ), can one justify looking at (X₁, ..., Xₘ, 1 − Y₁, ..., 1 − Yₙ)?}
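A sketch of the reduction suggested by the hint: since q = 1 − p, one has
\[
P\{1 - Y_j = 1\} = P\{Y_j = 0\} = 1 - q = p, \qquad j = 1, ..., n,
\]
so that X₁, ..., Xₘ, 1 − Y₁, ..., 1 − Yₙ may be treated as m + n iid Bernoulli(p) observations, and the sufficiency of their sum follows by the same conditional distribution argument as in Exercise 6.2.1.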