Theorem 3.9.4 Suppose that X is a real valued random variable such that with some $r > 0$ and $T \in \mathcal{T}(\subseteq \Re)$, one has $\phi = E\{|X - T|^r\}$ which is finite. Then, for any fixed real number $\varepsilon(> 0)$, one has

$$P\{|X - T| \geq \varepsilon\} \leq \varepsilon^{-r}\phi. \tag{3.9.9}$$

Proof Note that $P\{|X - T| \geq \varepsilon\} = P\{W \geq \varepsilon^r\}$ where $W = |X - T|^r$. Now, the inequality (3.9.9) follows immediately by invoking the Markov inequality, which gives $P\{W \geq \varepsilon^r\} \leq \varepsilon^{-r}E[W] = \varepsilon^{-r}\phi$. ∎
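To see the bound in action, here is a minimal Monte Carlo sketch of (3.9.9); the exponential(1) population and the choices of r, T, and ε below are illustrative assumptions, not from the text.

```python
import numpy as np

# Monte Carlo check of P{|X - T| >= eps} <= eps^{-r} * phi,
# where phi = E|X - T|^r  (Theorem 3.9.4).
rng = np.random.default_rng(7)
x = rng.exponential(scale=1.0, size=1_000_000)  # assumed population

r, T, eps = 2.0, 1.0, 1.5            # illustrative choices; T = mu here
phi = np.mean(np.abs(x - T) ** r)    # estimate of phi = E|X - T|^r

lhs = np.mean(np.abs(x - T) >= eps)  # estimate of P{|X - T| >= eps}
rhs = phi / eps ** r                 # the bound eps^{-r} * phi

print(f"P{{|X - T| >= {eps}}} ~ {lhs:.4f} <= bound {rhs:.4f}")
```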
Tchebysheff's inequality follows immediately from (3.9.9) by substituting $r = 2$ and $T = \mu$.
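Writing that substitution out: with $r = 2$ and $T = \mu$ we have $\phi = E[(X - \mu)^2] = \sigma^2$, so (3.9.9) specializes to the familiar form

$$P\{|X - \mu| \geq \varepsilon\} \leq \varepsilon^{-2}E[(X - \mu)^2] = \sigma^2/\varepsilon^2.$$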
3.9.3 Cauchy-Schwarz and Covariance Inequalities
If we have independent random variables $X_1$ and $X_2$, then we know from Theorem 3.5.1 that $E[X_1X_2] = E[X_1]E[X_2]$. But if $X_1$ and $X_2$ are dependent, then it is not always so simple to evaluate $E[X_1X_2]$. The Cauchy-Schwarz inequality allows us to split $E[X_1X_2]$ in the form of an upper bound having two separate parts, one involving only $X_1$ and the other involving only $X_2$.
Theorem 3.9.5 (Cauchy-Schwarz Inequality) Suppose that we have two real valued random variables $X_1$ and $X_2$ such that $E[X_1^2]$, $E[X_2^2]$, and $E[X_1X_2]$ are all finite. Then, we have

$$E^2[X_1X_2] \leq E[X_1^2]E[X_2^2]. \tag{3.9.10}$$

In (3.9.10), the equality holds if and only if $X_1 = kX_2$ w.p.1 for some constant $k$.
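As a numeric illustration before the proof, here is a minimal Monte Carlo sketch of (3.9.10); the dependent pair built from a shared normal component and the constant $k$ are illustrative assumptions, not from the text.

```python
import numpy as np

# Monte Carlo check of E^2[X1 X2] <= E[X1^2] E[X2^2]  (3.9.10).
rng = np.random.default_rng(11)
n = 1_000_000
x2 = rng.standard_normal(n)
x1 = x2 + 0.5 * rng.standard_normal(n)  # X1 and X2 are dependent

lhs = np.mean(x1 * x2) ** 2                # estimate of E^2[X1 X2]
rhs = np.mean(x1 ** 2) * np.mean(x2 ** 2)  # estimate of E[X1^2] E[X2^2]
print(f"{lhs:.4f} <= {rhs:.4f}")           # strict inequality here

# Equality case: X1 = k*X2 w.p.1 makes the two sides agree
# (up to Monte Carlo error).
k = 3.0
lhs_eq = np.mean(k * x2 * x2) ** 2
rhs_eq = np.mean((k * x2) ** 2) * np.mean(x2 ** 2)
print(f"{lhs_eq:.4f} ~= {rhs_eq:.4f}")
```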
Proof First note that if $E[X_2^2] = 0$, then $X_2 = 0$ w.p.1 so that both sides of (3.9.10) will reduce to zero. In other words, (3.9.10) holds when $E[X_2^2] = 0$. Now we assume that $E[X_2^2] > 0$. Let $\lambda$ be any real number. Then we can write

$$E[(X_1 + \lambda X_2)^2] = E[X_1^2] + 2\lambda E[X_1X_2] + \lambda^2 E[X_2^2].$$

But note that $(X_1 + \lambda X_2)^2$ is a non-negative random variable whatever be $\lambda$, and so $E[(X_1 + \lambda X_2)^2] \geq 0$ whatever be $\lambda$. If we substitute $\lambda \equiv \lambda_0 = -E[X_1X_2]/E[X_2^2]$ in this expression, we obtain $E[X_1^2] - E^2[X_1X_2]/E[X_2^2] \geq 0$, which is the inequality (3.9.10). Equality in (3.9.10) forces $E[(X_1 + \lambda_0 X_2)^2] = 0$, that is, $X_1 = kX_2$ w.p.1 with $k = -\lambda_0$; conversely, $X_1 = kX_2$ w.p.1 makes both sides of (3.9.10) equal to $k^2E^2[X_2^2]$. ∎
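A hedged sketch of the covariance inequality named in this section's title (stated here as a consequence of the theorem, assuming finite variances $\sigma_1^2$ and $\sigma_2^2$): applying (3.9.10) to the centered variables $X_1 - \mu_1$ and $X_2 - \mu_2$ gives

$$\mathrm{Cov}^2(X_1, X_2) = E^2[(X_1 - \mu_1)(X_2 - \mu_2)] \leq E[(X_1 - \mu_1)^2]E[(X_2 - \mu_2)^2] = \sigma_1^2\sigma_2^2,$$

so that the correlation coefficient satisfies $|\rho_{X_1,X_2}| \leq 1$.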