
6. Sufficiency, Completeness, and Ancillarity

holds. Consider any real valued function h(t) defined for t ∈ T, having finite expectation, such that

   E_θ[h(T)] = 0 for all θ ∈ Θ implies that h(t) = 0 w.p. 1.                (6.6.1)

In other words,

   ∫_{t ∈ T} h(t) g(t; θ) dt = 0 (or Σ_{t ∈ T} h(t) g(t; θ) = 0 when T is discrete)
   for all θ ∈ Θ must imply that h(t) = 0 for all t ∈ T, except possibly on a
   set of t values having probability zero.
Definition 6.6.3 A statistic T is said to be complete if and only if the family of distributions {g(t; θ): θ ∈ Θ} induced by T is complete.
In these definitions, observe that neither the statistic T nor the parameter θ has to be real valued. If we have, for example, a vector valued statistic T = (T_1, T_2, ..., T_k) and θ is also vector valued, then the requirement in (6.6.1) would obviously be replaced by the following:

   ∫ ... ∫ h(t_1, ..., t_k) g(t_1, ..., t_k; θ) dt_1 ... dt_k = 0 (or the
   corresponding multiple sum) for all θ ∈ Θ must imply that h(t) = 0 w.p. 1.   (6.6.2)
Here, h(t) is a real valued function of t = (t_1, ..., t_k) ∈ T and g(t; θ) is the pmf or the pdf of the statistic T at the point t.
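As a quick numerical illustration of this multiple-sum requirement, the following minimal Python sketch evaluates the double sum for an assumed setup that is not taken from the text: T = (T_1, T_2) with T_1, T_2 independent Bernoulli(p), and the nonzero function h(t_1, t_2) = t_1 - t_2.

# Illustrative sketch (assumed setup, not from the text): T = (T1, T2) with
# T1, T2 independent Bernoulli(p), and h(t1, t2) = t1 - t2.
def bernoulli_pmf(t, p):
    # g(t; p) = p^t (1 - p)^(1 - t) for t = 0, 1
    return p**t * (1 - p)**(1 - t)

def expectation_of_h(h, p):
    # double sum over t = (t1, t2) of h(t1, t2) * g(t1; p) * g(t2; p)
    return sum(h(t1, t2) * bernoulli_pmf(t1, p) * bernoulli_pmf(t2, p)
               for t1 in (0, 1) for t2 in (0, 1))

h = lambda t1, t2: t1 - t2
for p in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(p, expectation_of_h(h, p))   # prints 0.0 for every p

The expectation is zero for every p even though h is not identically zero, so this particular induced family fails to be complete; the sketch merely shows how the multiple-sum condition is evaluated in practice.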
The concept of completeness was introduced by Lehmann and Scheffé (1950) and further explored measure-theoretically by Bahadur (1957). Next we give two examples. More examples will be given in Section 6.6.1.
Example 6.6.1 Suppose that a statistic T is distributed as Bernoulli(p), 0 < p < 1. Let us examine whether T is a complete statistic according to Definition 6.6.3. The pmf induced by T is given by g(t; p) = p^t (1 - p)^(1-t), t = 0, 1. Consider any real valued function h(t) such that E_p[h(T)] = 0 for all 0 < p < 1. Now, let us focus on the possible values of h(t) when t = 0, 1, and write

   E_p[h(T)] = h(0)(1 - p) + h(1)p = p{h(1) - h(0)} + h(0) = 0 for all 0 < p < 1.   (6.6.3)
Observe that the middle step in (6.6.3) is linear in p, so unless it vanishes identically it can be zero for at most one value of p between zero and unity. But we are demanding that p{h(1) - h(0)} + h(0) be zero for infinitely many values of p in (0, 1). Hence, this expression must be identically equal to zero, which means that the constant term and the coefficient of p must both be zero. That is, we must have h(0) = 0 and h(1) - h(0) = 0, so that h(1) = 0. In other words, we have h(t) = 0 for t = 0, 1. Thus, T is a complete statistic. !
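The coefficient-matching argument in (6.6.3) can also be checked symbolically. The short Python sketch below is an illustration added here, not part of the text; it uses sympy, with the symbols h0 and h1 standing for h(0) and h(1).

import sympy as sp

p, h0, h1 = sp.symbols('p h0 h1', real=True)

# E_p[h(T)] for T ~ Bernoulli(p): h(0) g(0; p) + h(1) g(1; p)
expectation = h0 * (1 - p) + h1 * p

# Collect as a polynomial in p, matching the middle step of (6.6.3)
poly = sp.Poly(sp.expand(expectation), p)
print(poly.all_coeffs())   # coefficient of p is h1 - h0; constant term is h0

# E_p[h(T)] = 0 for infinitely many p forces every coefficient to vanish
print(sp.solve(poly.all_coeffs(), [h0, h1], dict=True))  # [{h0: 0, h1: 0}]

Setting both coefficients to zero returns h0 = h1 = 0, in agreement with the conclusion that h(t) = 0 for t = 0, 1.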