Determine the following:
(a) Covariance between X and Y
(b) Marginal probability distribution of X
(c) P(X < 116)
(d) Conditional probability distribution of X given that Y = 102
(e) P(X < 116 | Y = 102)

5-55. In the manufacture of electroluminescent lamps, several different layers of ink are deposited onto a plastic substrate. The thickness of these layers is critical if specifications regarding the final color and intensity of light are to be met. Let X and Y denote the thickness of two different layers of ink. It is known that X is normally distributed with a mean of 0.1 millimeter and a standard deviation of 0.00031 millimeter, and Y is normally distributed with a mean of 0.23 millimeter and a standard deviation of 0.00017 millimeter. The value of ρ for these variables is equal to 0. Specifications call for a lamp to have a thickness of the ink corresponding to X in the range of 0.099535 to 0.100465 millimeter and Y in the range of 0.22966 to 0.23034 millimeter. What is the probability that a randomly selected lamp will conform to specifications?

5-56. Patients given drug therapy either improve, remain the same, or degrade with probabilities 0.5, 0.4, and 0.1, respectively. Suppose that 20 patients (assumed to be independent) are given the therapy. Let X_1, X_2, and X_3 denote the number of patients who improved, stayed the same, or became degraded. Determine the following.
(a) Are X_1, X_2, X_3 independent?
(b) P(X_1 = 10)
(c) P(X_1 = 10, X_2 = 8, X_3 = 2)
(d) P(X_1 = 5 | X_2 = 12)
(e) E(X_1)

5-57. Suppose that X has a standard normal distribution. Let the conditional distribution of Y given X = x be normally distributed with mean E(Y | x) = 2x and variance V(Y | x) = 2x. Determine the following.
(a) Are X and Y independent?
(b) P(Y < 3 | X = 3)
(c) E(Y | X = 3)
(d) f_{XY}(x, y)
(e) Recognize the distribution f_{XY}(x, y) and identify the mean and variance of Y and the correlation between X and Y.

5-58. Suppose that X and Y have a bivariate normal distribution with joint probability density function f_{XY}(x, y; σ_X, σ_Y, μ_X, μ_Y, ρ).
(a) Show that the conditional distribution of Y given that X = x is normal.
(b) Determine E(Y | X = x).
(c) Determine V(Y | X = x).

5-59. If X and Y have a bivariate normal distribution with ρ = 0, show that X and Y are independent.

5-60. Show that the probability density function f_{XY}(x, y; σ_X, σ_Y, μ_X, μ_Y, ρ) of a bivariate normal distribution integrates to 1. [Hint: Complete the square in the exponent and use the fact that the integral of a normal probability density function for a single variable is 1.]

5-61. If X and Y have a bivariate normal distribution with joint probability density f_{XY}(x, y; σ_X, σ_Y, μ_X, μ_Y, ρ), show that the marginal probability distribution of X is normal with mean μ_X and standard deviation σ_X. [Hint: Complete the square in the exponent and use the fact that the integral of a normal probability density function for a single variable is 1.]
5-4 Linear Functions of Random Variables
A random variable is sometimes defined as a function of one or more random variables. In this section, results for linear functions are highlighted because of their importance in the remainder of the book. For example, if the random variables X_1 and X_2 denote the length and width, respectively, of a manufactured part, Y = 2X_1 + 2X_2 is a random variable that represents the perimeter of the part. As another example, recall that the negative binomial random variable was represented as the sum of several geometric random variables.

In this section, we develop results for random variables that are linear combinations of random variables.
Linear Combination

Given random variables X_1, X_2, …, X_p and constants c_1, c_2, …, c_p,

    Y = c_1 X_1 + c_2 X_2 + … + c_p X_p        (5-24)

is a linear combination of X_1, X_2, …, X_p.
Now E(Y) can be found from the joint probability distribution of X_1, X_2, …, X_p as follows. Assume that X_1, X_2, …, X_p are continuous random variables. An analogous calculation can be used for discrete random variables.

    E(Y) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} ⋯ ∫_{-∞}^{∞} (c_1 x_1 + c_2 x_2 + ⋯ + c_p x_p) f_{X_1 X_2 ⋯ X_p}(x_1, x_2, …, x_p) dx_1 dx_2 ⋯ dx_p

         = c_1 ∫_{-∞}^{∞} ∫_{-∞}^{∞} ⋯ ∫_{-∞}^{∞} x_1 f_{X_1 X_2 ⋯ X_p}(x_1, x_2, …, x_p) dx_1 dx_2 ⋯ dx_p
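A brief sketch of the step that follows: each p-fold integral of this form collapses to a single-variable expectation, because integrating the joint density over every variable except x_1 yields the marginal density of X_1. The same reduction applies to each remaining term, which leads to the familiar result E(Y) = c_1 E(X_1) + c_2 E(X_2) + ⋯ + c_p E(X_p).

    ∫_{-∞}^{∞} ⋯ ∫_{-∞}^{∞} x_1 f_{X_1 X_2 ⋯ X_p}(x_1, x_2, …, x_p) dx_1 dx_2 ⋯ dx_p
        = ∫_{-∞}^{∞} x_1 [ ∫_{-∞}^{∞} ⋯ ∫_{-∞}^{∞} f_{X_1 X_2 ⋯ X_p}(x_1, x_2, …, x_p) dx_2 ⋯ dx_p ] dx_1
        = ∫_{-∞}^{∞} x_1 f_{X_1}(x_1) dx_1
        = E(X_1)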