
This is an absolutely continuous distribution with density function

    (1/2)[exp(−x) + 2 exp(−2x)].

Since, for y = 1, 2,

    Pr(Y = y) = (1/2) ∫₀^∞ y exp(−yx) dx = 1/2,

it follows that Y has a discrete distribution with frequency function 1/2, y = 1, 2.
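This calculation is easy to verify numerically (a minimal sketch, assuming NumPy and SciPy are available; each integrand y exp(−yx) is an exponential density and so integrates to 1 over (0, ∞)):

    import numpy as np
    from scipy.integrate import quad

    # Pr(Y = y) = (1/2) * integral of y*exp(-y*x) over (0, inf), for y = 1, 2
    for y in (1, 2):
        value, _ = quad(lambda x, y=y: 0.5 * y * np.exp(-y * x), 0, np.inf)
        print(y, round(value, 6))  # prints 0.5 for both y = 1 and y = 2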

Independence
Consider a random vector (X, Y) with range X × Y. We say X and Y are independent if, for any A ⊂ X and B ⊂ Y, the events X ∈ A and Y ∈ B are independent events in the usual sense of elementary probability theory; that is, if

    Pr(X ∈ A, Y ∈ B) = Pr(X ∈ A)Pr(Y ∈ B).
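The definition can be illustrated by simulation (a minimal sketch; the marginal distributions of X and Y and the sets A and B are arbitrary choices made for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    x = rng.exponential(scale=1.0, size=n)  # X: exponential with rate 1
    y = rng.exponential(scale=0.5, size=n)  # Y: exponential with rate 2, drawn independently of X
    in_A = x <= 1.0                         # event {X ∈ A} with A = (0, 1]
    in_B = y <= 1.0                         # event {Y ∈ B} with B = (0, 1]
    print((in_A & in_B).mean())             # ≈ Pr(X ∈ A, Y ∈ B)
    print(in_A.mean() * in_B.mean())        # ≈ Pr(X ∈ A) Pr(Y ∈ B); the two agree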

                            Independence may easily be characterized in terms of either distribution functions or
                            expected values.


Theorem 2.1. Let (X, Y) denote a random vector with range X × Y and distribution function F. Let F_X and F_Y denote the marginal distribution functions of X and Y, respectively.
   (i) X and Y are independent if and only if, for all x, y,

           F(x, y) = F_X(x)F_Y(y).

  (ii) X and Y are independent if and only if, for all bounded functions g_1: X → R and g_2: Y → R,

           E[g_1(X)g_2(Y)] = E[g_1(X)]E[g_2(Y)].
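Part (ii) can likewise be checked by simulation before turning to the proof (a minimal sketch; cos t and 1/(1 + t²) serve only as convenient bounded functions, and the standard normal marginals are an arbitrary choice):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    x = rng.normal(size=n)            # X and Y: independent standard normals
    y = rng.normal(size=n)
    g1 = np.cos(x)                    # g1(X), bounded by 1
    g2 = 1.0 / (1.0 + y**2)           # g2(Y), bounded by 1
    print(np.mean(g1 * g2))           # ≈ E[g1(X) g2(Y)]
    print(np.mean(g1) * np.mean(g2))  # ≈ E[g1(X)] E[g2(Y)]; the two agree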

Proof. Suppose X and Y are independent. Let m denote the dimension of X and let n denote the dimension of Y. Fix x = (x_1, ..., x_m) and y = (y_1, ..., y_n); let

    A = (−∞, x_1] × ··· × (−∞, x_m]

and

    B = (−∞, y_1] × ··· × (−∞, y_n]

so that

    F(x, y) = Pr(X ∈ A, Y ∈ B), F_X(x) = Pr(X ∈ A), and F_Y(y) = Pr(Y ∈ B).

Then

    F(x, y) = Pr(X ∈ A, Y ∈ B) = Pr(X ∈ A)Pr(Y ∈ B) = F_X(x)F_Y(y).

Now suppose F(x, y) = F_X(x)F_Y(y). Since F_X(x)F_Y(y) is the distribution function of a random vector (X_1, Y_1) such that X_1 and Y_1 are independent with marginal distribution functions F_X and F_Y, respectively, it follows that (X, Y) has the same distribution as (X_1, Y_1); that is, X and Y are independent. This proves part (i).