5.2.10 Consider a sequence of real valued random variables {T_n; n > k}, and suppose that as n → ∞. Is it true that for some real number a, the random variable as n → ∞? {Hint: Can one apply Slutsky's Theorem after taking the natural logarithm?}
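Reading the supposition in Exercise 5.2.10 as T_n converging in probability to some θ > 0, and the question as concerning the power T_n^a, gives the following minimal sketch of the hint's logarithmic route (this reading is an assumption made here, not a statement of the exercise itself):

% Sketch under the assumed reading: T_n ->P theta with theta > 0, a a fixed real number.
\[
  T_n \xrightarrow{P} \theta
  \;\Longrightarrow\; \ln T_n \xrightarrow{P} \ln\theta
  \;\Longrightarrow\; a \ln T_n \xrightarrow{P} a \ln\theta
  \;\Longrightarrow\; T_n^{a} = e^{a \ln T_n} \xrightarrow{P} e^{a \ln\theta} = \theta^{a},
\]
% using the continuity of ln(.) on (0, infinity) and of exp(.), with Slutsky's
% Theorem handling the multiplication by the constant a.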
5.2.11 (Exercise 5.2.10 Continued) Let X_1, ..., X_n be iid Poisson(λ) with λ > 0, and let for n > k. Show that as n → ∞. Also, find the number c(> 0) such that as n → ∞.
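Exercise 5.2.11 works with iid Poisson(λ) observations; the short simulation below is only an illustrative sketch of the weak-law convergence of the sample mean to λ in probability, together with a smooth transform of it in the spirit of Exercise 5.2.10 (the values of λ, n, the number of replications, and the square-root transform are assumptions made purely for illustration).

# Illustrative sketch only: lam, n, reps and the sqrt transform are assumed
# for demonstration; they are not taken from Exercise 5.2.11 itself.
import numpy as np

rng = np.random.default_rng(1)
lam, n, reps = 2.0, 5000, 2000            # assumed Poisson mean, sample size, replications
X = rng.poisson(lam, size=(reps, n))      # reps independent Poisson(lam) samples of size n
xbar = X.mean(axis=1)                     # sample mean of each replicated sample

eps = 0.05
print("P(|xbar_n - lam| > eps) approx:", np.mean(np.abs(xbar - lam) > eps))
print("P(|sqrt(xbar_n) - sqrt(lam)| > eps) approx:",
      np.mean(np.abs(np.sqrt(xbar) - np.sqrt(lam)) > eps))
# Both estimated probabilities are near 0 for large n, consistent with
# xbar_n converging in probability to lam (weak law of large numbers) and
# sqrt(xbar_n) converging in probability to sqrt(lam) (continuity).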
5.2.12 Consider a sequence of real valued random variables {T_n; n ≥ 1}, and suppose that T_n converges in probability to some real number a(> 0) as n → ∞. Let us define X_n = I(T_n > ½a), n ≥ 1. Does X_n converge in probability to 1 as n → ∞? Suppose that Y_n = I(T_n > (3/2)a), n ≥ 1. Does Y_n converge in probability to 0 as n → ∞? {Hint: Try and apply Definition 5.2.1 directly. Can one use the Markov inequality here?}
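A minimal sketch along the hint's first suggestion (applying Definition 5.2.1 directly), assuming that T_n converges in probability to a with a(> 0) and taking any ε ∈ (0, 1):

% Assumes T_n ->P a with a > 0; fix any epsilon in (0, 1).
\[
  P(|X_n - 1| > \varepsilon) = P(X_n = 0) = P\!\left(T_n \le \tfrac{1}{2}a\right)
  \le P\!\left(|T_n - a| \ge \tfrac{1}{2}a\right) \longrightarrow 0,
\]
\[
  P(|Y_n - 0| > \varepsilon) = P(Y_n = 1) = P\!\left(T_n > \tfrac{3}{2}a\right)
  \le P\!\left(|T_n - a| > \tfrac{1}{2}a\right) \longrightarrow 0,
\]
% so that X_n converges in probability to 1 and Y_n to 0 as n -> infinity.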
5.2.13 (i) Consider the two sequences of random variables {U_n; n ≥ 1} and {V_n; n ≥ 1} respectively defined in (5.2.6)-(5.2.7). Verify that as n → ∞. Using Theorem 5.2.4, it will immediately follow that and as n → ∞.
(ii) Additionally suppose that U_n and V_n are independent for all n ≥ 1. In this situation, first obtain the probability distributions of and U_nV_n. Hence, show directly, that is, without appealing to Slutsky's Theorem, that and as n → ∞.
5.2.14 Suppose that (X_i, Y_i), i = 1, ..., 2n, are iid N_2(0, 0, 1, 1, ρ) with -1 < ρ < 1. Recall the bivariate normal distribution from Section 3.6. Let us denote U_i = I(X_{2i-1}Y_{2i-1} + X_{2i}Y_{2i} > 0) with i = 1, 2, ..., n. Consider now the sample mean Ū_n = n^{-1}(U_1 + ... + U_n), and denote
(i) Show that the U_i's are iid Bernoulli with p = ½(1 + ρ);
(ii) Show that as n → ∞.
{Hint: Note that p = P(U_1 = 1) = P(X_1Y_1 + X_2Y_2 > 0). But, one can write, for example, X_1Y_1 = ¼{(X_1 + Y_1)² - (X_1 - Y_1)²}, so that p = P{(X_1 + Y_1)² - (X_1 - Y_1)² + (X_2 + Y_2)² - (X_2 - Y_2)² > 0} = P{(X_1 + Y_1)² + (X_2 + Y_2)² > (X_1 - Y_1)² + (X_2 - Y_2)²}. Verify that U = (X_1 + Y_1)² + (X_2 + Y_2)² is independent of V = (X_1 - Y_1)² + (X_2 - Y_2)². Find the distributions of U
and V. Then, rewrite p as the probability of an appropriate event defined