
   Example 3.5.7 Suppose that the joint pdf of X₁, X₂ is given by

            f(x₁, x₂) = k exp{−θx₁ − (1/θ)x₂},  0 < x₁, x₂ < ∞,

for some θ > 0 where k(> 0) is a constant. Obviously, we can write
f(x₁, x₂) = kh₁(x₁)h₂(x₂) for all positive numbers x₁, x₂ where
h₁(x₁) = exp{−θx₁} and h₂(x₂) = exp{−(1/θ)x₂}. Also the supports
χ₁ = ℜ⁺, χ₂ = ℜ⁺ are unrelated to each other. Hence, by appealing to
the Theorem 3.5.3 we can claim that X₁ and X₂ are independent random
variables. One should note two interesting facts here. The two chosen
functions h₁(x₁) and h₂(x₂) are not even probability densities to begin
with. The other point is that the choices of these two functions are not
unique. One can easily replace these by kdh₁(x₁) and d⁻¹h₂(x₂)
respectively, where d(> 0) is any arbitrary number. Note that we did not
have to determine k in order to reach our conclusion that X₁ and X₂ are
independent random variables. !
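As a quick numerical illustration of this factorization argument (an editorial sketch, not part of the original text; the value θ = 2 and the cutoffs a, b are arbitrary choices, and SciPy is assumed to be available), one can recover k by integrating the two factors separately and then confirm that a rectangle probability splits into a product, exactly as independence demands:

```python
import numpy as np
from scipy import integrate

theta = 2.0   # illustrative choice; any theta > 0 behaves the same way

# The two factors from Example 3.5.7; neither is a pdf on its own
h1 = lambda x1: np.exp(-theta * x1)
h2 = lambda x2: np.exp(-x2 / theta)

# Their total masses over the support (0, oo)
m1 = integrate.quad(h1, 0, np.inf)[0]    # equals 1/theta
m2 = integrate.quad(h2, 0, np.inf)[0]    # equals theta
k = 1.0 / (m1 * m2)                      # normalizing constant of the joint pdf

# P(X1 <= a, X2 <= b) computed from the joint pdf ...
a, b = 0.7, 3.0
joint_prob = integrate.dblquad(lambda x2, x1: k * h1(x1) * h2(x2), 0, a, 0, b)[0]

# ... agrees with the product of the marginal probabilities
p1 = integrate.quad(h1, 0, a)[0] / m1
p2 = integrate.quad(h2, 0, b)[0] / m2
print(np.isclose(joint_prob, p1 * p2))   # True
```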
   Example 3.5.8 (Example 3.5.7 Continued) One can check that
f₁(x₁) = θ exp{−θx₁} and f₂(x₂) = θ⁻¹ exp{−(1/θ)x₂}, and then one may
apply (3.5.1) instead to arrive at the same conclusion. In view of the
Theorem 3.5.3 it becomes really simple to write down the marginal pdf's
in the case of independence. !
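The check mentioned above can also be carried out symbolically; the following sketch is an editorial addition (SymPy assumed available) that integrates out one coordinate at a time to recover f₁ and f₂, and incidentally shows that k = 1 for this particular joint pdf:

```python
import sympy as sp

x1, x2, theta, k = sp.symbols('x1 x2 theta k', positive=True)

# Joint pdf from Example 3.5.7, with the constant k left unspecified
f = k * sp.exp(-theta * x1 - x2 / theta)

# Marginal pdf of X1: integrate out x2 over (0, oo)
f1 = sp.integrate(f, (x2, 0, sp.oo))                 # k*theta*exp(-theta*x1)

# Requiring the marginal to integrate to one pins down k
k_val = sp.solve(sp.Eq(sp.integrate(f1, (x1, 0, sp.oo)), 1), k)[0]   # k = 1

f1 = f1.subs(k, k_val)                               # theta*exp(-theta*x1)
f2 = sp.integrate(f.subs(k, k_val), (x1, 0, sp.oo))  # exp(-x2/theta)/theta

# The joint pdf equals the product of the marginals, i.e. (3.5.1) holds
print(sp.simplify(f.subs(k, k_val) - f1 * f2))       # 0
print(f1, f2)
```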
   Example 3.5.9 Suppose that the joint pdf of X₁, X₂, X₃ is given by

      f(x₁, x₂, x₃) = cx₁^{−1/2} exp{−2x₁}(1 + x₂²)⁻¹ exp{−πx₃²},
                      0 < x₁ < ∞, −∞ < x₂, x₃ < ∞,

where c(> 0) is a constant. One has f(x₁, x₂, x₃) = ch₁(x₁)h₂(x₂)h₃(x₃)
with h₁(x₁) = x₁^{−1/2} exp{−2x₁}, h₂(x₂) = (1 + x₂²)⁻¹, and h₃(x₃) =
exp{−πx₃²}, for all values of x₁ ∈ ℜ⁺, (x₂, x₃) ∈ ℜ². Also, the supports
χ₁ = ℜ⁺, χ₂ = ℜ, χ₃ = ℜ are unrelated to each other. Hence, by appealing
to the Theorem 3.5.3 we can claim that X₁, X₂ and X₃ are independent
random variables. One should again note the interesting facts. The three
chosen functions h₁(x₁), h₂(x₂) and h₃(x₃) are not even probability
densities to begin with. The other point is that the choices of these
three functions are not unique. !
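The same reasoning can be verified symbolically; the sketch below is an editorial addition (SymPy assumed available) using the three factors h₁, h₂, h₃ as written above: each factor has finite total mass over its own support, and once c absorbs those masses, the joint pdf is exactly the product of the three marginal pdfs.

```python
import sympy as sp

x1 = sp.symbols('x1', positive=True)
x2, x3 = sp.symbols('x2 x3', real=True)

# The three factors from Example 3.5.9; none of them is a pdf by itself
h1 = x1 ** sp.Rational(-1, 2) * sp.exp(-2 * x1)
h2 = 1 / (1 + x2 ** 2)
h3 = sp.exp(-sp.pi * x3 ** 2)

# Each factor has finite total mass over its own support
m1 = sp.integrate(h1, (x1, 0, sp.oo))          # sqrt(pi/2)
m2 = sp.integrate(h2, (x2, -sp.oo, sp.oo))     # pi
m3 = sp.integrate(h3, (x3, -sp.oo, sp.oo))     # 1

# With c = 1/(m1*m2*m3) the joint pdf equals the product of the three
# marginal pdfs h1/m1, h2/m2, h3/m3, so X1, X2, X3 are mutually independent
c = 1 / (m1 * m2 * m3)
f = c * h1 * h2 * h3
print(sp.simplify(f - (h1 / m1) * (h2 / m2) * (h3 / m3)))   # 0
```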
   Example 3.5.10 (Example 3.5.9 Continued) One should check these out:
By looking at h₁(x₁) one can immediately guess that X₁ has the
Gamma(α = 1/2, β = 1/2) distribution so that its normalizing constant is
√(2/π). By looking at h₂(x₂) one can immediately guess that X₂ has the
Cauchy distribution so that its normalizing constant is 1/π. Similarly,
by looking at h₃(x₃) one can immediately guess that X₃ has the
N(0, 1/2π) distribution so that its normalizing constant is unity.
Hence, c = √(2/π)(1/π)(1) = √2/(π√π).
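These constants are easy to cross-check numerically; the following sketch is an editorial addition (SciPy assumed available) that recomputes each normalizing constant as the reciprocal of the total mass of the corresponding factor and confirms the value of c quoted above:

```python
import numpy as np
from scipy import integrate, special

# Normalizing constants quoted in Example 3.5.10
gamma_const = 1 / (special.gamma(0.5) * 0.5 ** 0.5)        # Gamma(1/2, 1/2): sqrt(2/pi)
cauchy_const = 1 / np.pi                                    # standard Cauchy
normal_const = 1 / np.sqrt(2 * np.pi * (1 / (2 * np.pi)))   # N(0, 1/(2*pi)): unity

# Each constant is the reciprocal of the total mass of the matching factor
m1 = integrate.quad(lambda x: x ** (-0.5) * np.exp(-2 * x), 0, np.inf)[0]
m2 = integrate.quad(lambda x: 1 / (1 + x ** 2), -np.inf, np.inf)[0]
m3 = integrate.quad(lambda x: np.exp(-np.pi * x ** 2), -np.inf, np.inf)[0]
print(np.allclose([gamma_const, cauchy_const, normal_const],
                  [1 / m1, 1 / m2, 1 / m3]))                # True

# Hence c = sqrt(2/pi) * (1/pi) * 1 = sqrt(2)/pi**(3/2)
c = gamma_const * cauchy_const * normal_const
print(np.isclose(c, np.sqrt(2) / np.pi ** 1.5))             # True
```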