Hence, the marginal density of $Y_1$ is
$$
\begin{cases}
\displaystyle\int_0^{1-y_1} dy_2 & \text{if } 0 \leq y_1 < 1 \\[6pt]
\displaystyle\int_{-y_1}^{1} dy_2 & \text{if } -1 < y_1 < 0
\end{cases}
\;=\;
\begin{cases}
1 - y_1 & \text{if } 0 \leq y_1 < 1 \\
1 + y_1 & \text{if } -1 < y_1 < 0
\end{cases}
\;=\; 1 - |y_1|, \quad |y_1| < 1.
$$
It follows that the distribution of $X_2 - X_1$ under parameter value $(\theta_1, \theta_2)$ has density
$$
\frac{1}{\theta_2 - \theta_1}\left(1 - \frac{|t|}{\theta_2 - \theta_1}\right), \quad |t| < \theta_2 - \theta_1.
$$
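This triangular form can be checked by simulation. The sketch below is not from the text; it assumes, consistent with the example above, that $X_1$ and $X_2$ are independent and uniformly distributed on $(\theta_1, \theta_2)$, and the particular values $\theta_1 = 2$, $\theta_2 = 5$ are arbitrary.

```python
import numpy as np

# Monte Carlo check of the triangular density of X2 - X1, assuming X1, X2
# are independent and uniform on (theta1, theta2); parameter values are arbitrary.
rng = np.random.default_rng(0)
theta1, theta2 = 2.0, 5.0
n = 200_000
x1 = rng.uniform(theta1, theta2, n)
x2 = rng.uniform(theta1, theta2, n)
t = x2 - x1

# Derived density: (1 / width) * (1 - |t| / width) on |t| < width, width = theta2 - theta1
width = theta2 - theta1
hist, edges = np.histogram(t, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
triangular = (1.0 / width) * (1.0 - np.abs(centers) / width)
print(f"max abs deviation of histogram from triangular density: "
      f"{np.max(np.abs(hist - triangular)):.3f}")
```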
7.4 Sums of Random Variables
Let $X_1, \ldots, X_n$ denote a sequence of real-valued random variables. We are often interested in the distribution of $S = \sum_{j=1}^{n} X_j$. Whenever the distribution of $X$ is absolutely continuous, the distribution of $S$ may be determined using Theorem 7.2. However, the distribution of a sum arises so frequently that we consider it in detail here; in addition, there are some results that apply only to sums.
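As a concrete, purely illustrative picture of what determining the distribution of $S$ involves, the sketch below approximates the density of the sum of two independent absolutely continuous random variables by numerically convolving their densities. The choice of Exponential(1) components is an assumption made only so the result can be compared with a known answer; this is an illustration, not the method of Theorem 7.2.

```python
import numpy as np

# Illustrative sketch only: approximate the density of S = X1 + X2 for two
# independent Exponential(1) variables by discretizing the convolution integral
# f_S(s) = int f(x) f(s - x) dx, and compare with the known Gamma(2, 1) density
# s * exp(-s).  The component distributions are arbitrary illustrative choices.
dx = 0.005
x = np.arange(0.0, 15.0, dx)
f = np.exp(-x)                      # Exponential(1) density on the grid

f_sum = np.convolve(f, f) * dx      # discrete approximation to the convolution
s = np.arange(len(f_sum)) * dx
exact = s * np.exp(-s)              # Gamma(2, 1) density

mask = s <= 8.0
print(f"max abs error on [0, 8]: {np.max(np.abs(f_sum[mask] - exact[mask])):.2e}")
```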
We begin by considering the characteristic function of S.
Theorem 7.4. Let $X = (X_1, \ldots, X_n)$ where $X_1, X_2, \ldots, X_n$ denote real-valued random variables. Let $\varphi_X$ denote the characteristic function of $X$ and let $\varphi_S$ denote the characteristic function of $S = \sum_{j=1}^{n} X_j$. Then
$$
\varphi_S(t) = \varphi_X(tv), \quad t \in \mathbb{R},
$$
where $v = (1, \ldots, 1) \in \mathbb{R}^n$.
Proof. The characteristic function of $X$ is given by
$$
\varphi_X(t) = \mathrm{E}[\exp\{i t^T X\}], \quad t \in \mathbb{R}^n.
$$
Since $S = v^T X$, the characteristic function of $S$ is given by
$$
\varphi_S(t) = \mathrm{E}[\exp\{i t v^T X\}] = \varphi_X(tv), \quad t \in \mathbb{R},
$$
verifying the theorem.
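The identity $\varphi_S(t) = \varphi_X(tv)$ can be illustrated numerically. The sketch below is not from the text: it takes $X$ to be a bivariate normal vector with correlated components, purely because $\varphi_X$ is then available in closed form, and the mean, covariance, and grid of $t$ values are arbitrary choices.

```python
import numpy as np

# Numerical illustration of Theorem 7.4: phi_S(t) = phi_X(t * v) with v = (1, 1).
# X is taken to be bivariate normal (an assumption made purely for illustration),
# so that phi_X(u) = exp(i u'mu - u'Sigma u / 2) is known in closed form.
rng = np.random.default_rng(1)
mu = np.array([1.0, -0.5])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])
x = rng.multivariate_normal(mu, Sigma, size=500_000)
s = x.sum(axis=1)                  # S = X1 + X2
v = np.array([1.0, 1.0])

for t in (0.2, 0.5, 1.0):
    emp = np.mean(np.exp(1j * t * s))                   # empirical char. function of S
    u = t * v
    exact = np.exp(1j * (u @ mu) - 0.5 * (u @ Sigma @ u))   # phi_X(t v) in closed form
    print(f"t={t}: empirical={emp:.4f}, phi_X(t v)={exact:.4f}")
```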
Example 7.15 (Sum of exponential random variables). Let $X_1, \ldots, X_n$ denote independent, identically distributed random variables, each with density function
$$
\lambda \exp\{-\lambda x\}, \quad x > 0,
$$
where $\lambda > 0$; this is the density function of the exponential distribution with parameter $\lambda$. The characteristic function of this distribution is given by
$$
\varphi(t) = \int_0^{\infty} \exp(itx)\, \lambda \exp(-\lambda x)\, dx = \frac{\lambda}{\lambda - it}, \quad -\infty < t < \infty.
$$
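As a simple numerical check (not part of the text), the closed form $\lambda/(\lambda - it)$ can be compared with a Monte Carlo estimate of $\mathrm{E}[\exp(itX)]$; the rate $\lambda = 1.5$ and the $t$ values below are arbitrary illustrative choices.

```python
import numpy as np

# Monte Carlo check of the exponential characteristic function
# phi(t) = lambda / (lambda - i t); lambda = 1.5 and the t grid are arbitrary.
rng = np.random.default_rng(2)
lam = 1.5
x = rng.exponential(scale=1.0 / lam, size=1_000_000)   # Exponential(rate = lam) draws

for t in (-2.0, 0.5, 3.0):
    empirical = np.mean(np.exp(1j * t * x))   # estimate of E[exp(i t X)]
    exact = lam / (lam - 1j * t)
    print(f"t={t}: empirical={empirical:.4f}, lambda/(lambda - it)={exact:.4f}")
```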