Page 57 - Elements of Distribution Theory
2.2 Marginal Distributions and Independence

If $X$ and $Y$ are independent,
$$
E[g_1(X)g_2(Y)] = \int_{\mathcal{X} \times \mathcal{Y}} g_1(x) g_2(y) \, dF(x, y)
= \int_{\mathcal{X} \times \mathcal{Y}} g_1(x) g_2(y) \, dF_X(x) \, dF_Y(y)
= E[g_1(X)] \, E[g_2(Y)].
$$
Conversely, suppose that $E[g_1(X)g_2(Y)] = E[g_1(X)]E[g_2(Y)]$ for all bounded, real-valued $g_1$, $g_2$. Let $d_1$ denote the dimension of $X$ and let $d_2$ denote the dimension of $Y$. Then, for a given $x \in \mathbf{R}^{d_1}$ and a given $y \in \mathbf{R}^{d_2}$, let $g_1$ denote the indicator function for the set
$$
(-\infty, x_1] \times \cdots \times (-\infty, x_{d_1}]
$$
and let $g_2$ denote the indicator function for the set
$$
(-\infty, y_1] \times \cdots \times (-\infty, y_{d_2}];
$$
here $x = (x_1, \ldots, x_{d_1})$ and $y = (y_1, \ldots, y_{d_2})$. Since $E[g_1(X)g_2(Y)] = E[g_1(X)]E[g_2(Y)]$, it follows that $F(x, y) = F_X(x)F_Y(y)$. Since $x$ and $y$ are arbitrary, $X$ and $Y$ are independent; this proves part (ii).
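The factorization in the proof can be illustrated numerically. The following sketch (my own construction, not from the text) draws independent Uniform(0, 1) samples for $X$ and $Y$, uses the proof's indicator functions $g_1 = \mathbf{1}\{X \le x_0\}$ and $g_2 = \mathbf{1}\{Y \le y_0\}$ for arbitrarily chosen points $x_0$, $y_0$, and checks that the sample analogue of $E[g_1(X)g_2(Y)]$ matches $E[g_1(X)]\,E[g_2(Y)]$ up to Monte Carlo error.

```python
import random

random.seed(0)
n = 200_000
x0, y0 = 0.3, 0.7  # arbitrary points defining the indicator sets

# Independent draws: X ~ Uniform(0, 1), Y ~ Uniform(0, 1)
xs = [random.random() for _ in range(n)]
ys = [random.random() for _ in range(n)]

# Indicator functions from the proof: g1 = 1{X <= x0}, g2 = 1{Y <= y0}
g1 = [1.0 if x <= x0 else 0.0 for x in xs]
g2 = [1.0 if y <= y0 else 0.0 for y in ys]

e_joint = sum(a * b for a, b in zip(g1, g2)) / n  # estimate of E[g1(X) g2(Y)]
e_prod = (sum(g1) / n) * (sum(g2) / n)            # estimate of E[g1(X)] E[g2(Y)]

print(abs(e_joint - e_prod))  # small, of Monte Carlo order
```

Both quantities estimate $F(x_0, y_0) = F_X(x_0) F_Y(y_0) = 0.21$ here, so the difference should be near zero.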

  For the case in which the distribution of (X, Y) is either absolutely continuous or discrete,
                        it is straightforward to characterize independence in terms of either the density functions
                        or frequency functions of X and Y. The formal result is given in the following corollary to
                        Theorem 2.1; the proof is left as an exercise.

Corollary 2.1.
   (i) Suppose $(X, Y)$ has an absolutely continuous distribution with density function $p$ and let $p_X$ and $p_Y$ denote the marginal density functions of $X$ and $Y$, respectively. $X$ and $Y$ are independent if and only if
$$
p(x, y) = p_X(x) \, p_Y(y)
$$
for almost all $x$, $y$.
  (ii) Suppose $(X, Y)$ has a discrete distribution with frequency function $p$ and let $p_X$ and $p_Y$ denote the marginal frequency functions of $X$ and $Y$, respectively. $X$ and $Y$ are independent if and only if
$$
p(x, y) = p_X(x) \, p_Y(y)
$$
for all $x$, $y$.
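For a finite discrete distribution, the criterion in part (ii) can be checked directly: compute the marginal frequency functions by summation and test whether the joint frequency function factors at every support point. A minimal sketch (the helper `is_independent` and the two example tables are my own, not from the text):

```python
def is_independent(p, tol=1e-12):
    """Test Corollary 2.1(ii) for a finite frequency function.

    p: dict mapping (x, y) -> P(X = x, Y = y).
    Marginals are obtained by summing the joint frequencies.
    """
    xs = sorted({x for x, _ in p})
    ys = sorted({y for _, y in p})
    p_x = {x: sum(p.get((x, y), 0.0) for y in ys) for x in xs}
    p_y = {y: sum(p.get((x, y), 0.0) for x in xs) for y in ys}
    return all(abs(p.get((x, y), 0.0) - p_x[x] * p_y[y]) <= tol
               for x in xs for y in ys)

# Two independent fair coin flips: the joint frequency function factors.
indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

# Perfectly correlated pair: p(0, 0) = 0.5 but p_X(0) p_Y(0) = 0.25.
dep = {(0, 0): 0.5, (1, 1): 0.5}

print(is_independent(indep))  # True
print(is_independent(dep))    # False
```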

Example 2.4 (Bivariate distribution). Consider the distribution introduced in Example 2.1. The random vector $(X, Y)$ has an absolutely continuous distribution with density function
$$
p(x, y) = 6(1 - x - y), \quad x > 0, \ y > 0, \ x + y < 1
$$
and the marginal density of $X$ is
$$
p_X(x) = 3(1 - x)^2, \quad 0 < x < 1;
$$
the same argument used to derive $p_X$ may be used to show that the marginal density of $Y$ is also
$$
p_Y(y) = 3(1 - y)^2, \quad 0 < y < 1.
$$
Clearly, $p \neq p_X p_Y$, so that $X$ and $Y$ are not independent.
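The computations in Example 2.4 can be verified numerically. The sketch below (my own check, with an arbitrary grid size and evaluation points) integrates $p(x, y)$ over $y$ by a midpoint Riemann sum to recover $p_X(x) = 3(1-x)^2$, and confirms that the joint density does not equal the product of the marginals (note $p_Y = p_X$ by symmetry):

```python
def p(x, y):
    """Joint density from Example 2.4 on the triangle x > 0, y > 0, x + y < 1."""
    return 6.0 * (1.0 - x - y) if (x > 0 and y > 0 and x + y < 1) else 0.0

n = 2000
h = 1.0 / n

def p_x(x):
    """Marginal density of X: integrate p(x, y) over y (midpoint rule)."""
    return sum(p(x, (j + 0.5) * h) for j in range(n)) * h

# Marginal matches the closed form 3(1 - x)^2 at an arbitrary point.
print(p_x(0.4), 3 * (1 - 0.4) ** 2)

# Joint density at (0.2, 0.2) differs from p_X(0.2) * p_Y(0.2),
# so X and Y are not independent (p_Y = p_X by symmetry).
print(p(0.2, 0.2), p_x(0.2) * p_x(0.2))
```

At $(0.2, 0.2)$ the joint density is $6(0.6) = 3.6$ while the product of marginals is $(3 \cdot 0.64)^2 \approx 3.686$, exhibiting the failure of factorization directly.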