One will recall from (1.7.27) that this pdf is known as the lognormal density and the corresponding $X$ is called a lognormal random variable. Suppose that $X_1, X_2$ are iid having the common pdf $f(x)$. Let $r$ and $s$ be arbitrary, but fixed real numbers. Then, obtain the expression for
3.6.1 Derive the marginal pdf of $X_1$ in Theorem 3.6.1, part (i).
3.6.2 Prove Theorem 3.6.1, part (iii).
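Although Exercises 3.6.1-3.6.2 call for analytical derivations, the marginal normality of $X_1$ can also be checked empirically. The sketch below is only an illustration: it draws from a bivariate normal with arbitrarily chosen parameters (not taken from the text) and compares the empirical distribution of $X_1$ with the $N(\mu_1, \sigma_1^2)$ cdf.

```python
import numpy as np
from scipy.stats import norm

# Illustrative parameters (chosen here, not from the exercise): N_2(mu1, mu2, sig1^2, sig2^2, rho)
mu1, mu2, sig1, sig2, rho = 1.0, -2.0, 2.0, 3.0, 0.4
cov = [[sig1**2, rho * sig1 * sig2],
       [rho * sig1 * sig2, sig2**2]]

rng = np.random.default_rng(0)
x = rng.multivariate_normal([mu1, mu2], cov, size=200_000)

# Compare the empirical cdf of X1 with the N(mu1, sig1^2) cdf at a few points
for t in (-2.0, 0.0, 1.0, 3.0):
    emp = np.mean(x[:, 0] <= t)
    thy = norm.cdf(t, loc=mu1, scale=sig1)
    print(f"t = {t:5.1f}   empirical = {emp:.4f}   N(mu1, sig1^2) = {thy:.4f}")
```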
3.6.3 (Example 3.6.2 Continued) Suppose that $(X_1, X_2)$ is distributed as $N_2(0, 0, 1, 1, \rho)$ where one has $\rho \in (-1, 1)$. Find the expression of the mgf of the random variable $X_1X_2$, that is $E\{\exp[tX_1X_2]\}$ for $t$ belonging to an appropriate subinterval of ℜ.
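Once a closed-form answer to Exercise 3.6.3 is in hand, it can be compared against a brute-force Monte Carlo estimate of $E\{\exp[tX_1X_2]\}$. The sketch below uses arbitrarily chosen values of $\rho$ and $t$; the value of $t$ must, of course, lie in the subinterval of ℜ where the mgf is finite.

```python
import numpy as np

rho, t = 0.3, 0.2                      # arbitrary illustrative values, rho in (-1, 1)
cov = [[1.0, rho], [rho, 1.0]]         # covariance matrix of N_2(0, 0, 1, 1, rho)

rng = np.random.default_rng(1)
x = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000)

# The sample average of exp(t * X1 * X2) approximates the mgf of X1*X2 at t
mgf_hat = np.mean(np.exp(t * x[:, 0] * x[:, 1]))
print(f"Monte Carlo estimate of E[exp(t X1 X2)] at t = {t}, rho = {rho}: {mgf_hat:.4f}")
```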
3.6.4 Suppose that the joint pdf of $(X_1, X_2)$ is given by
for $-\infty < x_1, x_2 < \infty$. Evaluate $E[X_i]$, $V[X_i]$ for $i = 1, 2$, and $\rho_{X_1,X_2}$.
3.6.5 Suppose that the joint pdf of $(X_1, X_2)$ is given by
for $-\infty < x_1, x_2 < \infty$ where $k$ is a positive number. Evaluate $E[X_i]$, $V[X_i]$ for $i = 1, 2$, and $\rho_{X_1,X_2}$.
3.6.6 Suppose that $(X_1, X_2)$ is distributed as $N_2(3, 1, 16, 25, 3/5)$. Evaluate $P\{3 < X_2 < 8 \mid X_1 = 7\}$ and $P\{-3 < X_1 < 3 \mid X_2 = 4\}$.
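For a numerical check of Exercise 3.6.6, one can use the standard result that for a bivariate normal the conditional distribution of one coordinate given the other is again normal, with mean $\mu_2 + \rho(\sigma_2/\sigma_1)(x_1 - \mu_1)$ and variance $\sigma_2^2(1 - \rho^2)$ (and symmetrically in the other direction). The sketch below simply plugs in the stated parameters; the helper functions are introduced here for illustration only.

```python
from scipy.stats import norm

# Parameters of N_2(3, 1, 16, 25, 3/5)
mu1, mu2, var1, var2, rho = 3.0, 1.0, 16.0, 25.0, 0.6
sd1, sd2 = var1 ** 0.5, var2 ** 0.5

def cond_prob_x2_given_x1(a, b, x1):
    """P{a < X2 < b | X1 = x1} via the normal conditional distribution."""
    m = mu2 + rho * (sd2 / sd1) * (x1 - mu1)
    s = sd2 * (1.0 - rho ** 2) ** 0.5
    return norm.cdf(b, loc=m, scale=s) - norm.cdf(a, loc=m, scale=s)

def cond_prob_x1_given_x2(a, b, x2):
    """P{a < X1 < b | X2 = x2} via the normal conditional distribution."""
    m = mu1 + rho * (sd1 / sd2) * (x2 - mu2)
    s = sd1 * (1.0 - rho ** 2) ** 0.5
    return norm.cdf(b, loc=m, scale=s) - norm.cdf(a, loc=m, scale=s)

print(cond_prob_x2_given_x1(3, 8, x1=7))    # P{3 < X2 < 8 | X1 = 7}
print(cond_prob_x1_given_x2(-3, 3, x2=4))   # P{-3 < X1 < 3 | X2 = 4}
```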
3.6.7 Suppose that $X_1$ is distributed as $N(\mu, \sigma^2)$ and conditionally the distribution of $X_2$ given that $X_1 = x_1$ is $N(x_1, \sigma^2)$. Then, show that the joint distribution of $(X_1, X_2)$ is given by $N_2(\mu, \mu, \sigma^2, 2\sigma^2, 1/\sqrt{2})$.
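The hierarchical specification in Exercise 3.6.7 is easy to simulate directly, which gives an empirical check on the means, variances, and correlation of the claimed joint distribution. The values of $\mu$ and $\sigma$ below are arbitrary illustrations.

```python
import numpy as np

mu, sigma = 2.0, 1.5                      # arbitrary illustrative values
rng = np.random.default_rng(2)
n = 500_000

x1 = rng.normal(mu, sigma, size=n)        # X1 ~ N(mu, sigma^2)
x2 = rng.normal(x1, sigma)                # X2 | X1 = x1 ~ N(x1, sigma^2)

print("means of X1, X2:", x1.mean(), x2.mean())        # both close to mu
print("vars  of X1, X2:", x1.var(),  x2.var())         # close to sigma^2 and 2*sigma^2
print("corr(X1, X2):   ", np.corrcoef(x1, x2)[0, 1])   # close to 1/sqrt(2), about 0.7071
```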
3.6.8 Suppose that the joint pdf of $(X_1, X_2)$ is given by
for $-\infty < x_1, x_2 < \infty$.
(i) By direct integration, verify that $f(x_1, x_2)$ is a genuine pdf;
(ii) Show that the marginal distributions of $X_1$, $X_2$ are both univariate normal;
(iii) Does this $f(x_1, x_2)$ match with the density given by (3.6.1)?
3.6.9 Suppose that the joint pdf of $(X_1, X_2, X_3)$ is given by