immediately note that
From (3.4.12) we conclude that
Hence, one has |ρ_{X1,X2}| = 1.
(iii) We conclude that |ρ_{X1,X2}| = 1 provided that we have equality throughout
(3.4.13). The covariance inequality (Theorem 3.9.6) dictates that we can
have equality in (3.4.13) if and only if the two random variables (X_1 − µ_1) and
(X_2 − µ_2) are linearly related w.p.1. In other words, one will have |ρ_{X1,X2}| = 1
if and only if (X_1 − µ_1) = c(X_2 − µ_2) + d w.p.1 where c and d are two real
numbers. The result follows with a = µ_1 − cµ_2 + d and b = c. One may
observe that ρ_{X1,X2} = +1 or −1 according as b is positive or negative. ■
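As a quick numerical illustration of this theorem (a sketch, not part of the text), the following Python snippet draws an arbitrary sample for X_1, sets X_2 = a + bX_1 exactly, and confirms that the sample correlation coefficient equals +1 or −1 according as b is positive or negative; the distribution of X_1 and the values of a and b are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(seed=1)
    x1 = rng.exponential(scale=2.0, size=10_000)      # arbitrary choice of distribution for X1

    for a, b in [(3.0, 0.5), (3.0, -0.5)]:
        x2 = a + b * x1                               # X2 is an exact linear function of X1
        rho = np.corrcoef(x1, x2)[0, 1]               # sample correlation coefficient
        print(f"b = {b:+.1f}  ->  rho = {rho:+.4f}")  # prints +1.0000 and -1.0000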
From Definition 3.4.3, recall that when ρ_{X1,X2} = 0, the two
random variables X_1, X_2 are called uncorrelated. The case of zero correlation
is addressed in more detail in Section 3.7.
If the correlation coefficient ρ_{X1,X2} is one in magnitude, then the
two random variables are linearly related with probability one.
Zero correlation is referred to as uncorrelation.
Example 3.4.4 (Example 3.4.1 Continued) One can check that V(X_1) = 6.4971
and V(X_2) = .6825. We also found earlier that Cov(X_1, X_2) = 0.2415. Thus, using
(3.4.8) we get ρ_{X1,X2} = Cov(X_1, X_2)/(σ_{X1}σ_{X2}) = 0.2415/√{(6.4971)(.6825)} ≈ .11468. !
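As a quick check of this arithmetic (a sketch using only the variances and covariance quoted above):

    import math

    var_x1, var_x2, cov = 6.4971, 0.6825, 0.2415   # values from Example 3.4.4
    rho = cov / math.sqrt(var_x1 * var_x2)         # (3.4.8): rho = Cov(X1, X2)/(sigma_X1 * sigma_X2)
    print(round(rho, 5))                           # 0.11468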
Example 3.4.5 (Example 3.4.2 Continued) We had already shown that
Cov(X_1, X_2) = 0. One can also check that both V(X_1) and V(X_2) are finite, and
hence ρ_{X1,X2} = 0. !
Example 3.4.6 (Example 3.4.3 Continued) We had already shown that
Cov(X_1, X_2) = 3/160 whereas E(X_1) = 3/4, E(X_2) = 3/8. Proceeding analo-
gously, one can check that E(X_1²) = 3/5 and E(X_2²) = 1/5.
Thus, V(X_1) = 3/5 − (3/4)² = 3/80 and V(X_2) = 1/5 − (3/8)² = 19/320.
Hence, ρ_{X1,X2} = (3/160)/√{(3/80)(19/320)} ≈ .39736. !
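The moments in this example can be verified with exact rational arithmetic; the sketch below assumes only the quantities quoted above.

    from fractions import Fraction as F
    import math

    e_x1, e_x2 = F(3, 4), F(3, 8)         # E(X1), E(X2)
    e_x1sq, e_x2sq = F(3, 5), F(1, 5)     # E(X1^2), E(X2^2)
    cov = F(3, 160)                       # Cov(X1, X2)

    var_x1 = e_x1sq - e_x1 ** 2           # 3/80
    var_x2 = e_x2sq - e_x2 ** 2           # 19/320
    rho = float(cov) / math.sqrt(var_x1 * var_x2)
    print(var_x1, var_x2, round(rho, 5))  # 3/80 19/320 0.39736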