
                        Example 1.29 (Cauchy distribution). Let X denote a real-valued random variable with an
                        absolutely continuous distribution with density function
\[
p(x) = \frac{1}{\pi(1 + x^2)}, \qquad -\infty < x < \infty;
\]
this is a standard Cauchy distribution. If E(X) exists, it must be equal to
\[
\int_{-\infty}^{\infty} \frac{x}{\pi(1 + x^2)} \, dx
= \int_{0}^{\infty} \frac{x}{\pi(1 + x^2)} \, dx
- \int_{0}^{\infty} \frac{x}{\pi(1 + x^2)} \, dx.
\]
Since
\[
\int_{0}^{\infty} \frac{x}{\pi(1 + x^2)} \, dx = \infty,
\]
                        it follows that E(X) does not exist.
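As a brief numerical illustration (a sketch added here, assuming NumPy is available; the seed and sample sizes are arbitrary), the nonexistence of E(X) shows up in simulation: sample means of standard Cauchy draws do not settle near any fixed value as the sample size grows.

```python
# Sketch: sample means of standard Cauchy draws do not stabilize,
# consistent with E(X) failing to exist.  Assumes NumPy.
import numpy as np

rng = np.random.default_rng(0)
for n in (10**2, 10**4, 10**6):
    x = rng.standard_cauchy(n)    # draws from the density 1/(pi*(1 + x^2))
    print(n, x.mean())            # the printed means fluctuate wildly with n
```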
  Now suppose that X is a random vector X = (X_1, ..., X_d), where X_j, j = 1, ..., d, is real-valued. Then E(X) is simply the vector (E(X_1), ..., E(X_d)).
Example 1.30 (Uniform distribution on the unit cube). Let X = (X_1, X_2, X_3) denote a three-dimensional random vector with the uniform distribution on (0, 1)^3; see Examples 1.7 and 1.25. Then X has an absolutely continuous distribution with density function
\[
p(x) = 1, \qquad x \in (0, 1)^3.
\]
It follows that
\[
E(X_1) = \int_{0}^{1} x \, dx = \frac{1}{2}.
\]
Similarly, E(X_2) = E(X_3) = 1/2. It follows that E(X) = (1/2, 1/2, 1/2).
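A short Monte Carlo sketch (an illustration assuming NumPy; the sample size is arbitrary) checks this value: the componentwise sample mean of uniform draws on (0, 1)^3 is close to (1/2, 1/2, 1/2).

```python
# Sketch: Monte Carlo approximation of E(X) for X uniform on (0, 1)^3.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(size=(100_000, 3))   # each row is a draw of X = (X1, X2, X3)
print(x.mean(axis=0))                # approximately [0.5, 0.5, 0.5]
```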

                        Expectation of a function of a random variable
                        Let X denote a random variable, possibly vector-valued, and let g denote a real-valued
function defined on \mathcal{X}, the range of X. Let Y = g(X) and let H denote the distribution
                        function of Y. Then

\[
E(Y) = \int_{-\infty}^{\infty} y \, dH(y)
\]
provided that the integral exists. An important result is that we may also compute E(Y) by

\[
E(Y) = \int_{\mathcal{X}} g(x) \, dF(x), \tag{1.3}
\]
where F denotes the distribution function of X, so that the probability distribution of Y is not needed to compute its expected value.
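For a concrete check of (1.3) in the discrete case, the following sketch (a hypothetical example chosen here: X uniform on {-2, -1, 0, 1, 2} and g(x) = x^2) computes E(Y) both from the frequency function of Y and directly from the frequency function of X; the two computations agree.

```python
# Sketch: verifying (1.3) for a discrete distribution.
# X is uniform on {-2, -1, 0, 1, 2} and g(x) = x**2 (an assumed example).
from collections import defaultdict

p = {x: 0.2 for x in (-2, -1, 0, 1, 2)}      # frequency function of X

def g(x):
    return x ** 2

# Right-hand side of (1.3): sum of g(x) p(x) over the range of X.
direct = sum(g(x) * px for x, px in p.items())

# Expectation computed from the frequency function f of Y = g(X).
f = defaultdict(float)
for x, px in p.items():
    f[g(x)] += px
via_f = sum(y * fy for y, fy in f.items())

print(direct, via_f)    # both equal 2.0
```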
                          When X has a discrete distribution, with frequency function p, proof of this result is
                        straightforward. Let f denote the frequency function of Y. Then

\[
E(Y) = \sum_{y} y f(y).
\]
                        Note that

\[
f(y) = \Pr(Y = y) = \Pr(X \in \{x : g(x) = y\}) = \sum_{x : g(x) = y} p(x).
\]