Probability and Statistical Inference
2. Expectations of Functions of Random Variables
f(x) = αβ^{-α}x^{α-1}exp(-[x/β]^α)I(x > 0) where α(> 0) and β(> 0). Evaluate E(X^r)
for any arbitrary but fixed r > 0. {Hint: Try the substitution u = [x/β]^α during
the integration.}
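For orientation, here is a sketch of where the hinted substitution leads; treat it as a check on your own work rather than as the stated solution:

```latex
% With u = (x/\beta)^\alpha we get du = \alpha\beta^{-\alpha}x^{\alpha-1}\,dx
% and x = \beta u^{1/\alpha}, so the moment integral becomes a gamma integral:
\begin{aligned}
E(X^r) &= \int_0^\infty x^r\,\alpha\beta^{-\alpha}x^{\alpha-1}
          e^{-(x/\beta)^\alpha}\,dx \\
       &= \beta^r \int_0^\infty u^{r/\alpha}e^{-u}\,du
        = \beta^r\,\Gamma\!\left(1+\tfrac{r}{\alpha}\right).
\end{aligned}
```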
2.4.1 In this exercise, you are given the expressions of the mgf of different
random variables. In each case, (i) identify the random variable either by
its standard name or by explicitly writing down its pmf or the pdf, (ii) find the
values of both µ and σ² for the random variable X.
(i) M_X(t) = e^{5t}, for t ∈ ℜ;
(ii) M_X(t) = 1, for t ∈ ℜ;
(iii) M_X(t) = 1/2(1 + e^t), for t ∈ ℜ;
(iv) M_X(t) = 1/3(e^t + 1 + e^{2t}), for t ∈ ℜ;
(v) M_X(t) = 1/10(e^{4t} + 3 + 6e^{2t}), for t ∈ ℜ;
(vi) M_X(t) = 1/2401(3e^t + 4)^4, for t ∈ ℜ.
{Hint: Think of a discrete random variable and how one actually finds its
mgf. Then, use Theorem 2.4.1.}
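The hint can be made concrete with a made-up mgf that does not appear among the parts above (purely illustrative, not from the text):

```latex
% For a discrete X with pmf P(X = x_j) = p_j, the mgf is a finite weighted
% sum of exponentials, and each term p_j e^{t x_j} exposes one support point:
M_X(t) = E\left[e^{tX}\right] = \sum\nolimits_j p_j\, e^{t x_j}.
% Illustrative example (hypothetical):
% M_X(t) = \tfrac{1}{4}\left(3e^{-t} + e^{2t}\right) puts mass 3/4 at -1 and
% mass 1/4 at 2, so \mu = \tfrac{3}{4}(-1) + \tfrac{1}{4}(2) = -\tfrac{1}{4}.
```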
2.4.2 In this exercise, you are given the expressions for the mgf of a
random variable X. In each case, (i) identify the random variable X either by
its standard name or by explicitly writing down its pmf or the pdf, (ii) find the
values of both µ and σ² for the random variable X.
(i) E[e^{tX}] = e^{25t^2}, for t ∈ ℜ;
(ii) E[e^{tX}] = e^{t^2}, for t ∈ ℜ;
(iii) M_X(t) = (1 - 6t + 9t^2)^{-2}, for t < 1/3.
{Hint: Think of a continuous random variable from Section 1.7 and
see if that helps in each part. Then, use Theorem 2.4.1.}
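Two standard mgfs from Section 1.7 are worth keeping at hand for this exercise (stated here with the shape–scale parameterization of the gamma density; adjust if your reference uses the rate parameterization):

```latex
% mgf of a normal random variable:
M_X(t) = \exp\left(\mu t + \tfrac{1}{2}\sigma^2 t^2\right),
  \quad t \in \Re, \qquad \text{for } X \sim N(\mu, \sigma^2);
% mgf of a gamma random variable with shape \alpha and scale \beta:
M_X(t) = (1 - \beta t)^{-\alpha},
  \quad t < 1/\beta, \qquad \text{for } X \sim \mathrm{Gamma}(\alpha, \beta).
```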
2.4.3 A random variable X has its mgf given by
Find µ and σ². Can the distribution of X be identified?
2.4.4 (Example 2.4.4 Continued) Show that E(X^r) = E(Y^r) for all r = 1,
2, ... where the pdf of X is f(x) = (2π)^{-1/2}x^{-1}exp[-1/2(log(x))^2]I(x > 0) and
that of Y is g(y) = f(y)[1 + c sin(2π log(y))]I(y > 0). Here, c is a fixed number,
-1 ≤ c ≤ 1 and c ≠ 0. {Hint: Note that the pdf f(x) actually matches
with the lognormal density from (1.7.27). When handling g(y), first show
that the multiplier of f(y) is always positive. Then while evaluating
∫ y^r g(y)dy, all one needs to show is that