so that $E[X] \geq \max(E[X_1], E[X_2])$. !
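Since the example defining X is carried over from the preceding page, the following Monte Carlo check is only a sketch under the assumption that X = max(X₁, X₂), the natural choice for which X ≥ X₁ and X ≥ X₂ pointwise and hence E[X] ≥ max(E[X₁], E[X₂]); the exponential distributions are arbitrary choices of mine.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Assumed setup (the defining example precedes this page): X = max(X1, X2).
x1 = rng.exponential(scale=1.0, size=n)   # E[X1] = 1
x2 = rng.exponential(scale=2.0, size=n)   # E[X2] = 2
x = np.maximum(x1, x2)

# E[X] should dominate max(E[X1], E[X2]); here roughly 2.33 vs 2.00.
print(x.mean(), max(x1.mean(), x2.mean()))
```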
Example 3.9.14 Suppose that X is exponentially distributed with mean λ > 0. Then, what can we say about $E[X^{1/2}]$? Look at the function $f(x) = x^{1/2}$ for x > 0. This function is concave because $f''(x) = -\frac{1}{4}x^{-3/2} < 0$. Hence, from Jensen's inequality we can immediately claim that

$E[X^{1/2}] \leq (E[X])^{1/2} = \lambda^{1/2}.$

Similarly, one can also show that $E[\log(X)] \leq \log(E[X]) = \log(\lambda)$. !
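A minimal numerical sanity check of these two bounds (not from the text; numpy and the choice λ = 3 are my own):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 3.0                                    # arbitrary mean for the exponential
x = rng.exponential(scale=lam, size=1_000_000)

# For concave f, Jensen gives E[f(X)] <= f(E[X]).
print(np.sqrt(x).mean(), np.sqrt(lam))       # about 1.53 <= 1.73
print(np.log(x).mean(), np.log(lam))         # about 0.52 <= 1.10
```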
What follows is yet another nice application of Jensen's inequality.
Theorem 3.9.8 (Lyapunov's Inequality) Suppose that X is a real valued random variable. Let $m_r = E[|X|^r]$. Then, $m_r^{1/r}$ is increasing in r for r ≥ 1.
Proof We want to show that $m_s^{1/s} \leq m_r^{1/r}$ for $1 \leq s < r$. Observe that

$m_r = E[|X|^r] = E\left[(|X|^s)^{r/s}\right] \geq \left(E[|X|^s]\right)^{r/s} = m_s^{r/s},$

using Jensen's inequality since $f(x) = x^{r/s}$, x > 0, is convex when r/s > 1. Thus, one has

$m_r^{1/r} \geq m_s^{1/s},$

which is the desired result. ∎
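One can also see the monotonicity of $m_r^{1/r}$ numerically; the sketch below (my own, not from the text) uses the absolute value of a standard normal sample, but any random variable with the relevant moments finite would do.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.abs(rng.standard_normal(1_000_000))   # |X| for X ~ N(0, 1)

# m_r^{1/r} = (E[|X|^r])^{1/r} should be nondecreasing in r for r >= 1.
for r in (1.0, 1.5, 2.0, 3.0, 4.0):
    print(r, (x ** r).mean() ** (1.0 / r))   # roughly 0.80, 0.90, 1.00, 1.17, 1.32
```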
3.9.5 Hölder's Inequality
In the Cauchy-Schwarz inequality (3.9.10), the upper bound consisted of the product of $E[X^2]$ and $E[Y^2]$. However, what if we have a situation like this: for the random variable X, moments higher than the second may be assumed finite, but for the random variable Y, only moments lower than the second may be assumed finite. Under this scenario, the Cauchy-Schwarz inequality does not help much to obtain an upper bound for E[XY]. The following inequality is more flexible. For brevity, we do not supply its proof.
Theorem 3.9.9 (Hölder's Inequality) Let X and Y be two real valued random variables. Then, with r > 1, s > 1 such that $\frac{1}{r} + \frac{1}{s} = 1$, one has

$E[|XY|] \leq \left(E[|X|^r]\right)^{1/r} \left(E[|Y|^s]\right)^{1/s},$

provided that $E[|X|^r]$ and $E[|Y|^s]$ are finite.
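As a quick illustration (my own sketch, not from the text), one can verify the inequality by simulation; the distributions and the pair (r, s) = (3, 3/2), which satisfies 1/r + 1/s = 1, are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)
y = rng.exponential(scale=1.0, size=n)

r, s = 3.0, 1.5                              # 1/r + 1/s = 1/3 + 2/3 = 1
lhs = np.abs(x * y).mean()
rhs = (np.abs(x) ** r).mean() ** (1 / r) * (np.abs(y) ** s).mean() ** (1 / s)
print(lhs, rhs)                              # lhs ~ 0.80 <= rhs ~ 1.41
```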
Example 3.9.15 Suppose that we have two dependent random variables X and Y where X is distributed as N(0, 1), but the pdf of Y is given by