We denote $y_1 = x_1 + x_2$, $y_2 = x_2$ so that for this one-to-one transformation we have $x_1 = y_1 - y_2$, $x_2 = y_2$ where $0 < y_2 < y_1 < \infty$. One can verify that $|\det(J)| = 1$, and hence the joint pdf of $Y_1$ and $Y_2$ would become
$$g(y_1, y_2) = e^{-y_1}, \quad 0 < y_2 < y_1 < \infty. \tag{4.4.21}$$
Then, from (4.4.21) we obtain the marginal pdf of $Y_1$ as
$$g_1(y_1) = y_1 e^{-y_1}, \quad 0 < y_1 < \infty.$$
In other words, $Y_1 = X_1 + X_2$ has the Gamma(2, 1) distribution. We leave out the intermediate steps as the Exercise 4.4.16. !
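As an informal numerical check, one may simulate this result; the Python sketch below (not part of the text) draws iid standard exponential pairs with NumPy and compares the empirical distribution of $X_1 + X_2$ against Gamma(2, 1) using SciPy. The sample size and seed are arbitrary illustrative choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)   # arbitrary seed
n = 100_000                      # arbitrary sample size

# X1, X2 iid standard exponential, i.e. Exp(1)
x1 = rng.exponential(scale=1.0, size=n)
x2 = rng.exponential(scale=1.0, size=n)
y1 = x1 + x2

# Compare the simulated Y1 = X1 + X2 with the Gamma(2, 1) distribution
ks = stats.kstest(y1, stats.gamma(a=2, scale=1).cdf)
print("KS statistic:", ks.statistic)                    # should be small
print("sample mean, var:", y1.mean(), y1.var(ddof=1))   # theory: 2 and 2
```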
In the next example $X_1$, $X_2$ and $X_3$ are independent, but the transformed variables $Y_1$, $Y_2$ and $Y_3$ are dependent.
Example 4.4.14 (Example 4.4.13 Continued) Suppose that $X_1$, $X_2$ and $X_3$ are iid standard exponential random variables. Thus,
$$f(x_1, x_2, x_3) = e^{-(x_1 + x_2 + x_3)}, \quad 0 < x_1, x_2, x_3 < \infty.$$
We denote $y_1 = x_1 + x_2 + x_3$, $y_2 = x_2$, $y_3 = x_3$ so that for this one-to-one transformation we have $x_1 = y_1 - y_2 - y_3$, $x_2 = y_2$, $x_3 = y_3$ where $0 < y_2 < y_1 < \infty$, $0 < y_3 < y_1 < \infty$ and $y_2 + y_3 < y_1$. One can verify that $|\det(J)| = 1$, and hence the joint pdf of $Y_1$, $Y_2$ and $Y_3$ would become
$$g(y_1, y_2, y_3) = e^{-y_1}, \quad 0 < y_2, \ 0 < y_3, \ y_2 + y_3 < y_1 < \infty. \tag{4.4.22}$$
Then, from (4.4.22) we obtain the marginal pdf of $Y_1$ as
$$g_1(y_1) = \tfrac{1}{2}\, y_1^{2} e^{-y_1}, \quad 0 < y_1 < \infty.$$
In other words, $Y_1 = X_1 + X_2 + X_3$ has the Gamma(3, 1) distribution. We leave out the intermediate steps as the Exercise 4.4.17. !
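The integration left to the exercise can also be checked symbolically; the sketch below (a minimal check assuming SymPy is available, with illustrative variable names) integrates the joint pdf $e^{-y_1}$ over $0 < y_3 < y_1 - y_2$ and then $0 < y_2 < y_1$, recovering the Gamma(3, 1) density $\tfrac{1}{2} y_1^2 e^{-y_1}$.

```python
import sympy as sp

y1, y2, y3 = sp.symbols("y1 y2 y3", positive=True)

# Joint pdf of (Y1, Y2, Y3): e^{-y1} on 0 < y2, 0 < y3, y2 + y3 < y1
joint = sp.exp(-y1)

# Integrate out y3 over (0, y1 - y2), then y2 over (0, y1)
marginal = sp.integrate(
    sp.integrate(joint, (y3, 0, y1 - y2)),
    (y2, 0, y1),
)
print(sp.simplify(marginal))   # expected: y1**2*exp(-y1)/2, the Gamma(3, 1) pdf
```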
In the next example $X_1$ and $X_2$ are dependent, but the transformed variables $Y_1$ and $Y_2$ are independent.
Example 4.4.15 Suppose that the random vector $(X_1, X_2)$ has the bivariate normal distribution, $N_2(0, 0, \sigma^2, \sigma^2, \rho)$ with $0 < \sigma < \infty$, $-1 < \rho < 1$.