When we consider two random variables $X_1$ and $X_2$ only, we may suppress the subscripts from $\rho_{X_1,X_2}$ and simply write $\rho$ instead.
Definition 3.4.3 Two random variables $X_1, X_2$ are respectively called negatively correlated, uncorrelated, or positively correlated if and only if $\rho_{X_1,X_2}$ is negative, zero, or positive.
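As a quick illustration (the numbers here are chosen only for the sake of example), suppose that $\mathrm{Cov}(X_1, X_2) = -2$, $V(X_1) = 4$, and $V(X_2) = 9$. Then, by (3.4.8),
$$\rho_{X_1,X_2} = \frac{\mathrm{Cov}(X_1, X_2)}{\sqrt{V(X_1)}\sqrt{V(X_2)}} = \frac{-2}{(2)(3)} = -\frac{1}{3},$$
so that $X_1$ and $X_2$ are negatively correlated.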
Before we explain the role of a correlation coefficient any further, let us
state and prove the following result.
Theorem 3.4.2 Consider any two discrete or continuous random variables $X_1$ and $X_2$ for which we can assume that $-\infty < \mathrm{Cov}(X_1, X_2) < \infty$, $0 < V(X_1) < \infty$, and $0 < V(X_2) < \infty$. Let $\rho_{X_1,X_2}$, defined by (3.4.8), stand for the correlation coefficient between $X_1$ and $X_2$. We have the following results:
(i) Let $Y_i = c_i + d_i X_i$ where $-\infty < c_i < \infty$ and $0 < d_i < \infty$ are fixed numbers, $i = 1, 2$. Then, $\rho_{Y_1,Y_2} = \rho_{X_1,X_2}$;
(ii) $|\rho_{X_1,X_2}| \leq 1$;
(iii) In part (ii), the equality holds, that is, $\rho_{X_1,X_2}$ is $+1$ or $-1$, if and only if $X_1$ and $X_2$ are linearly related. In other words, $\rho_{X_1,X_2}$ is $+1$ or $-1$ if and only if $X_1 = a + bX_2$ w.p.1 for some real numbers $a$ and $b$.
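Before turning to the proof, a quick empirical sanity check may be helpful. The following Python sketch (entirely ours, not part of the text; the simulated distributions and the constants $c_i, d_i$ are arbitrary) estimates $\rho$ from a large sample and checks parts (i)-(iii) numerically:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Simulate a correlated pair (X1, X2): X2 depends linearly on X1 plus noise.
n = 100_000
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)   # induces positive correlation

def corr(a, b):
    """Sample analogue of (3.4.8): Cov(a, b) / (sd(a) * sd(b))."""
    return np.cov(a, b)[0, 1] / (a.std(ddof=1) * b.std(ddof=1))

rho_x = corr(x1, x2)

# Part (i): rho is unchanged under Y_i = c_i + d_i * X_i with d_i > 0.
y1 = 3.0 + 2.0 * x1
y2 = -1.0 + 5.0 * x2
rho_y = corr(y1, y2)

print(rho_x, rho_y)           # the two values agree
assert abs(rho_x - rho_y) < 1e-10   # part (i)
assert abs(rho_x) <= 1.0            # part (ii)

# Part (iii): an exactly linear pair has |rho| = 1.
z = 4.0 - 2.0 * x1            # X1 and Z linearly related with b < 0
print(corr(x1, z))            # -1, up to floating-point rounding
```

The two printed correlations agree because, by part (i), $\rho$ is invariant under positive affine transformations; the exactly linear pair in the last lines drives the sample correlation to $-1$, in line with part (iii).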
Proof (i) We apply Theorem 3.3.2 and Theorem 3.4.1 to claim that
$$\mathrm{Cov}(Y_1, Y_2) = d_1 d_2\,\mathrm{Cov}(X_1, X_2). \tag{3.4.9}$$
Also, we have
$$V(Y_i) = d_i^2 V(X_i), \quad i = 1, 2. \tag{3.4.10}$$
Next we combine (3.4.8)-(3.4.10) to obtain
$$\rho_{Y_1,Y_2} = \frac{d_1 d_2\,\mathrm{Cov}(X_1, X_2)}{d_1\sqrt{V(X_1)}\; d_2\sqrt{V(X_2)}} = \rho_{X_1,X_2},$$
since $d_1$ and $d_2$ are both positive.
(ii) We apply the Cauchy-Schwarz inequality (Theorem 3.9.5) or directly the covariance inequality (Theorem 3.9.6) from Section 3.9 and imme-