

these two random variables are independent. But, we cannot claim that Cov(X_1, X_2) = 0 because E[X_2] is not finite. !
Two random variables X_1 and X_2 may be independent, but that may not necessarily imply that Cov(X_1, X_2) = 0. This has been emphasized in the Example 3.5.5.
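As an added illustration of this point (not the setup of Example 3.5.5 itself, and with distributions chosen here only for convenience): suppose that X_1 has a Bernoulli(1/2) distribution, X_2 has a Cauchy distribution, and X_1, X_2 are independent. Then

   Cov(X_1, X_2) = E[X_1 X_2] - E[X_1]E[X_2]

cannot even be evaluated, because E[X_2], and hence E[X_1 X_2], does not exist for the Cauchy distribution. Independence holds, yet the covariance is undefined rather than zero.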
Example 3.5.6 What will happen to the conclusions in the Theorem 3.3.1, parts (i)-(ii), when the two random variables X_1 and X_2 are independent? Let X_1 and X_2 be independent. Then, E_{2|1}{X_2 | X_1} = E[X_2] which is a constant, so that E_1[E{X_2}] = E[X_2]. Also, V_1[E_{2|1}{X_2 | X_1}] = V_1[E{X_2}] = 0 since the variance of a constant is zero. On the other hand, V_{2|1}{X_2 | X_1} = V[X_2] which is a constant. Thus, E_1[V_{2|1}{X_2 | X_1}] = E_1[V{X_2}] = V[X_2]. Hence, V_1[E_{2|1}{X_2 | X_1}] + E_1[V_{2|1}{X_2 | X_1}] simplifies to V[X_2]. !
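As a quick concrete check of Example 3.5.6 (an added illustration; the distributions are chosen here only for convenience): let X_1 have a Bernoulli(p) distribution and X_2 have a N(μ, σ²) distribution, with X_1 and X_2 independent. Then

   E_{2|1}{X_2 | X_1} = μ, so that V_1[E_{2|1}{X_2 | X_1}] = 0,
   V_{2|1}{X_2 | X_1} = σ², so that E_1[V_{2|1}{X_2 | X_1}] = σ²,

and the two pieces add up to σ² = V[X_2], in agreement with the Theorem 3.3.1.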
                              We have more to say on similar matters again in the Section 3.7. We end
                           this section with yet another result which will come in very handy in verifying
                           whether some jointly distributed random variables are independent. In some
                           situations, applying the Definition 3.5.1 may be a little awkward because (3.5.1)
                           requires one to check whether the joint pmf or the pdf is the same as the
                           product of the marginal pmf’s or pdf’s. But, then one may not yet have
                           identified all the marginal pmf’s or pdf’s explicitly! What is the way out?
                           Look at the following result. Its proof is fairly routine and we leave it as the
                           Exercise 3.5.9.
Suppose that we have the random variables X_1, ..., X_k where χ_i is the support of X_i, i = 1, ..., k. Next, we formalize the notion of the supports χ_i's being unrelated to each other.
Definition 3.5.2 We say that the supports χ_i's are unrelated to each other if the following condition holds: Consider any subset of the random variables, X_{i_j}, j = 1, ..., p, p = 1, ..., k - 1. Conditionally, given that X_{i_j} = x_{i_j}, the support of the remaining random variable X_l stays the same as χ_l for any l ≠ i_j, j = 1, ..., p, p = 1, ..., k - 1.
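To fix ideas (an added illustration, not from the text): if X_1, X_2 have the joint pdf f(x_1, x_2) = 2 for 0 < x_1 < x_2 < 1, then χ_1 = χ_2 = (0, 1), but given X_1 = x_1 the conditional support of X_2 shrinks to (x_1, 1), which is not the same as χ_2. These supports are related. By contrast, any joint pdf that is positive on the whole rectangle (0, 1) × (0, 1) has supports that are unrelated in the sense of the Definition 3.5.2.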
Theorem 3.5.3 Suppose that the random variables X_1, ..., X_k have the joint pmf or pdf f(x_1, ..., x_k), x_i ∈ χ_i, the support of X_i, i = 1, ..., k. Assume that these supports are unrelated to each other according to the Definition 3.5.2. Then, X_1, ..., X_k are independent if and only if

   f(x_1, ..., x_k) = h_1(x_1) h_2(x_2) ... h_k(x_k) for all x_i ∈ χ_i, i = 1, ..., k,

for some functions h_1(x_1), ..., h_k(x_k).
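As an added sketch of how the Theorem 3.5.3 may be used: suppose X_1, X_2 have the joint pdf f(x_1, x_2) = 4x_1x_2 for 0 < x_1 < 1, 0 < x_2 < 1. The supports χ_1 = χ_2 = (0, 1) are unrelated, and f factors as h_1(x_1)h_2(x_2) with, for example, h_1(x_1) = 2x_1 and h_2(x_2) = 2x_2. Hence X_1 and X_2 are independent, and this conclusion is reached without first working out the marginal pdf's. On the other hand, for the triangular pdf f(x_1, x_2) = 2 on 0 < x_1 < x_2 < 1 mentioned after the Definition 3.5.2, the supports are related, the theorem does not apply, and in fact X_1 and X_2 are dependent there.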