where $\theta_1, \theta_2, \ldots$ denotes a sequence of constants, each taking values in the interval $(0, 1)$.
For any $\epsilon > 0$,
$$\Pr(|X_n| \geq \epsilon) = \Pr(X_n = 1) = \theta_n;$$
hence, $X_n \xrightarrow{p} 0$ provided that $\lim_{n \to \infty} \theta_n = 0$.
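For instance, one concrete choice (any sequence tending to $0$ works) is $\theta_n = 1/n$; then, for any $0 < \epsilon \leq 1$,
$$\Pr(|X_n| \geq \epsilon) = \Pr(X_n = 1) = \frac{1}{n} \to 0 \quad \text{as } n \to \infty,$$
while for $\epsilon > 1$ the probability is $0$ for every $n$, so $X_n \xrightarrow{p} 0$.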
Example 11.12 (Normal random variables). Let $Z, Z_1, Z_2, \ldots$ denote independent random variables, each with a standard normal distribution, and let $\alpha_1, \alpha_2, \ldots$ denote a sequence of real numbers satisfying $0 < \alpha_n < 1$ for $n = 1, 2, \ldots$ and $\alpha_n \to 1$ as $n \to \infty$. Let
$$X_n = (1 - \alpha_n) Z_n + \alpha_n Z, \quad n = 1, 2, \ldots$$
and let $X = Z$. Then, for any $m = 1, 2, \ldots$, $(X_1, \ldots, X_m)$ has a multivariate normal distribution with mean vector $0$ and covariance matrix with $(i, j)$th element given by $\alpha_i \alpha_j$ if $i \neq j$ and by $(1 - \alpha_j)^2 + \alpha_j^2$ if $i = j$.
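These covariance entries can be verified directly from the independence of $Z, Z_1, Z_2, \ldots$: for $i \neq j$,
$$\mathrm{Cov}(X_i, X_j) = \mathrm{Cov}\big((1 - \alpha_i) Z_i + \alpha_i Z,\ (1 - \alpha_j) Z_j + \alpha_j Z\big) = \alpha_i \alpha_j \,\mathrm{Var}(Z) = \alpha_i \alpha_j,$$
while, for $i = j$,
$$\mathrm{Var}(X_j) = (1 - \alpha_j)^2 \,\mathrm{Var}(Z_j) + \alpha_j^2 \,\mathrm{Var}(Z) = (1 - \alpha_j)^2 + \alpha_j^2.$$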
Note that $X_n - X = (1 - \alpha_n)(Z_n - Z)$ so that, for any $\epsilon > 0$,
$$\Pr\{|X_n - X| \geq \epsilon\} = \Pr\{|Z_n - Z| \geq \epsilon/(1 - \alpha_n)\}$$
so that, by Markov's inequality, together with the fact that $\mathrm{E}[|Z_n - Z|^2] = 2$,
$$\Pr\{|X_n - X| \geq \epsilon\} \leq \frac{2(1 - \alpha_n)^2}{\epsilon^2}, \quad n = 1, 2, \ldots.$$
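In more detail, this bound follows by applying Markov's inequality to $|Z_n - Z|^2$:
$$\Pr\{|Z_n - Z| \geq \epsilon/(1 - \alpha_n)\} = \Pr\{|Z_n - Z|^2 \geq \epsilon^2/(1 - \alpha_n)^2\} \leq \frac{(1 - \alpha_n)^2\, \mathrm{E}[|Z_n - Z|^2]}{\epsilon^2} = \frac{2(1 - \alpha_n)^2}{\epsilon^2},$$
where $\mathrm{E}[|Z_n - Z|^2] = 2$ since $Z_n - Z$ has a normal distribution with mean $0$ and variance $2$.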
Since $\alpha_n \to 1$ as $n \to \infty$, the right-hand side tends to $0$; it follows that $X_n \xrightarrow{p} X$ as $n \to \infty$.
As noted above, an important distinction between convergence of a sequence $X_n$, $n = 1, 2, \ldots$, to $X$ in distribution and in probability is that convergence in distribution depends only on the marginal distribution functions of $X_n$ and of $X$, while convergence in probability is concerned with the distribution of $|X_n - X|$. Hence, for convergence in probability, the joint distribution of $X_n$ and $X$ is relevant. This is illustrated in the following example.
Example 11.13 (Sequence of Bernoulli random variables). Let $X_1, X_2, \ldots$ denote a sequence of real-valued random variables such that, for each $n = 1, 2, \ldots$,
$$\Pr(X_n = 1) = 1 - \Pr(X_n = 0) = \frac{1}{2}\,\frac{n + 1}{n}$$
and let $X$ denote a random variable satisfying
$$\Pr(X = 1) = \Pr(X = 0) = 1/2.$$
Then, by Example 11.1, $X_n \xrightarrow{D} X$ as $n \to \infty$.
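This claim may also be checked directly from the distribution functions: for $0 \leq x < 1$,
$$\Pr(X_n \leq x) = \Pr(X_n = 0) = \frac{n - 1}{2n} \to \frac{1}{2} = \Pr(X \leq x) \quad \text{as } n \to \infty,$$
and the distribution functions of $X_n$ and $X$ agree for $x < 0$ and for $x \geq 1$, so convergence holds at every continuity point of the distribution function of $X$.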
However, whether or not $X_n$ converges to $X$ in probability will depend on the joint distributions of $(X, X_1), (X, X_2), \ldots$. For instance, if, for each $n$, $X_n$ and $X$ are independent,

