Page 171 - Probability and Statistical Inference
3. Multivariate Random Variables
Example 3.9.4 Suppose that X is a random variable whose mgf M_X(t) is
finite for some t ∈ T ⊆ (−∞, 0). Then, it follows from Theorem 3.9.2 that
for any fixed real number a, one can claim:

P{X ≤ a} ≤ inf_{t ∈ T} {e^{−ta} M_X(t)}.

Its verification is left as Exercise 3.9.1. !
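As a quick numerical illustration of this type of bound (a sketch, not part of the text): for X ~ N(0, 1) the mgf is M_X(t) = e^{t²/2}, so the infimum of e^{−ta} M_X(t) over negative t can be approximated on a grid and compared with the exact left-tail probability P{X ≤ a}.

```python
import math

def mgf_std_normal(t):
    # mgf of N(0,1): M_X(t) = exp(t^2 / 2)
    return math.exp(t * t / 2.0)

def std_normal_cdf(a):
    # Phi(a), expressed through the error function
    return 0.5 * (1.0 + math.erf(a / math.sqrt(2.0)))

a = -2.0
# approximate inf_{t in T} e^{-ta} M_X(t) over a grid of t in (-5, 0)
ts = [-j / 100.0 for j in range(1, 501)]
bound = min(math.exp(-t * a) * mgf_std_normal(t) for t in ts)

exact = std_normal_cdf(a)   # exact P{X <= a}
print(exact, bound)         # the exact probability never exceeds the bound
```

Here a short calculus step shows the infimum is attained at t = a, giving the bound e^{−a²/2}, which the grid search recovers.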
The following section provides yet another application of the Markov
inequality.
3.9.2 Tchebysheff's Inequality
This inequality follows from a more general inequality (Theorem 3.9.4)
which is stated and proved a little later. We take the liberty to state the simpler
version separately for its obvious prominence in the statistical literature.
Theorem 3.9.3 (Tchebysheff's Inequality) Suppose that X is a real val-
ued random variable with a finite second moment. Let us denote its mean µ
and variance σ² (> 0). Then, for any fixed real number ε (> 0), one has

P{|X − µ| ≥ ε} ≤ σ²/ε².   (3.9.7)

We know that P{|X − µ| < kσ} = 1 − P{|X − µ| ≥ kσ}. Thus, with k > 0,
if we substitute ε = kσ in (3.9.7), we can immediately conclude:

P{|X − µ| < kσ} ≥ 1 − 1/k².   (3.9.8)
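To see (3.9.7) in action numerically (an illustrative sketch, not from the text): for X ~ Exponential(1) one has µ = σ = 1, the two-sided tail probability is available in closed form, and it can be compared against the bound σ²/ε² = 1/k² with ε = kσ.

```python
import math

mu, sigma = 1.0, 1.0  # X ~ Exponential(1): mean 1, variance 1

def tail_prob(k):
    # P{|X - mu| >= k*sigma} for X ~ Exponential(1)
    upper = math.exp(-(mu + k * sigma))   # P{X >= mu + k*sigma}
    lower_cut = mu - k * sigma
    # P{X <= mu - k*sigma}, zero once mu - k*sigma <= 0
    lower = 1.0 - math.exp(-lower_cut) if lower_cut > 0 else 0.0
    return upper + lower

for k in (0.5, 1.5, 2.0, 3.0):
    chebyshev = 1.0 / k**2            # bound from (3.9.7) with eps = k*sigma
    print(k, tail_prob(k), chebyshev)  # tail probability stays below the bound
```

Note that the bound is useful only for k > 1; for k ≤ 1 it exceeds one and is trivially true.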
In statistics, sometimes (3.9.8) is also referred to as Tchebysheff's in-
equality. Suppose we denote p_k = P{|X − µ| < kσ}. Again, (3.9.7) or equiva-
lently (3.9.8) provides distribution-free bounds for some appropriate probabil-
ity. Yet, let us look at the following table:
Table 3.9.1. Values of p_k and the Tchebysheff Lower Bound (3.9.8)

                       k = 1     k = 2       k = 3          k = 4
Tchebysheff's Bound      0      3/4 = .75   8/9 ≈ .88889   15/16 = .9375
p_k : X is N(0, 1)    .68268    .95450      .99730         .99994
In the case of the standard normal distribution, the Tchebysheff lower bound
for p_k appears quite reasonable for k = 3, 4. In the case k = 1, Tchebysheff's
inequality provides a trivial bound whatever the distribution of X may be.
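The entries of Table 3.9.1 can be reproduced in a few lines (a sketch, not from the text): for X ~ N(0, 1), p_k = P{|X| < kσ} = 2Φ(k) − 1, which equals erf(k/√2).

```python
import math

def p_k(k):
    # p_k = P{|X| < k} for X ~ N(0,1), i.e. 2*Phi(k) - 1 = erf(k / sqrt(2))
    return math.erf(k / math.sqrt(2.0))

for k in (1, 2, 3, 4):
    bound = 1.0 - 1.0 / k**2               # Tchebysheff lower bound (3.9.8)
    print(k, round(bound, 5), round(p_k(k), 5))
```

The exact probabilities dominate the distribution-free bounds for every k, as Theorem 3.9.3 guarantees.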