

The joint probability distribution can be used to determine E(X) even in cases in which the marginal probability distribution of X is not known. As practice, you can use the joint probability distribution to verify that E(Y) = 0.32 in Example 5-1.
Also,

    V(X) = np(1 - p) = 4(0.9)(1 - 0.9) = 0.36

Verify that the same result can be obtained from the joint probability distribution of X and Y.
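As an informal check, the following Python sketch rebuilds the joint distribution of Example 5-1 and computes E(Y) and V(X) directly from it. The per-bit probabilities 0.9 (acceptable), 0.08 (suspect), and 0.02 (unacceptable) are assumptions inferred from the numerical values quoted later in this section, not restated from Example 5-1 itself.

```python
from math import factorial

# Sketch only: assumed per-bit probabilities for Example 5-1
# (acceptable 0.9, suspect 0.08, unacceptable 0.02), four bits transmitted.
p_a, p_s, p_u = 0.9, 0.08, 0.02
n = 4

def f_xy(x, y):
    """Joint pmf P(X = x, Y = y) from the multinomial distribution."""
    z = n - x - y                      # number of unacceptable bits
    if z < 0:
        return 0.0
    coeff = factorial(n) // (factorial(x) * factorial(y) * factorial(z))
    return coeff * p_a**x * p_s**y * p_u**z

support = [(x, y) for x in range(n + 1) for y in range(n + 1 - x)]

E_Y = sum(y * f_xy(x, y) for x, y in support)               # expect 0.32
E_X = sum(x * f_xy(x, y) for x, y in support)
V_X = sum(x**2 * f_xy(x, y) for x, y in support) - E_X**2   # expect 0.36

print(round(E_Y, 4), round(V_X, 4))
```

Note that the sums run over the joint distribution only; the marginal distributions of X and Y are never formed explicitly, which is exactly the point of the exercise.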



               5-1.3  Conditional Probability Distributions

When two random variables are defined in a random experiment, knowledge of one can change the probabilities that we associate with the values of the other. Recall that in Example 5-1, X denotes the number of acceptable bits and Y denotes the number of suspect bits received by a receiver. Because only four bits are transmitted, if X = 4, Y must equal 0. Using the notation for conditional probabilities from Chapter 2, we can write this result as P(Y = 0 | X = 4) = 1. If X = 3, Y can only equal 0 or 1. Consequently, the random variables X and Y can be considered to be dependent. Knowledge of the value obtained for X changes the probabilities associated with the values of Y.
   Recall that the definition of conditional probability for events A and B is P(B | A) = P(A ∩ B)/P(A). This definition can be applied with the event A defined to be X = x and the event B defined to be Y = y.

EXAMPLE 5-5       For Example 5-1, X and Y denote the number of acceptable and suspect bits received, respectively. The remaining bits are unacceptable.

    P(Y = 0 | X = 3) = P(X = 3, Y = 0)/P(X = 3) = f_XY(3, 0)/f_X(3) = 0.05832/0.2916 = 0.200

The probability that Y = 1 given that X = 3 is

    P(Y = 1 | X = 3) = P(X = 3, Y = 1)/P(X = 3) = f_XY(3, 1)/f_X(3) = 0.2333/0.2916 = 0.800

Given that X = 3, the only possible values for Y are 0 and 1. Notice that P(Y = 0 | X = 3) + P(Y = 1 | X = 3) = 1. The values 0 and 1 for Y, along with the probabilities 0.200 and 0.800, define the conditional probability distribution of Y given that X = 3.
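The arithmetic in Example 5-5 can be checked with a few lines of Python; the joint and marginal values below are taken directly from the example.

```python
# Values taken directly from Example 5-5
f_XY_3_0 = 0.05832    # f_XY(3, 0)
f_XY_3_1 = 0.2333     # f_XY(3, 1)
f_X_3 = 0.2916        # f_X(3)

print(f_XY_3_0 / f_X_3)                        # P(Y = 0 | X = 3) = 0.200
print(f_XY_3_1 / f_X_3)                        # P(Y = 1 | X = 3) ≈ 0.800
print(f_XY_3_0 / f_X_3 + f_XY_3_1 / f_X_3)     # sums to 1 (up to rounding)
```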

   Example 5-5 illustrates that the conditional probabilities that Y = y given that X = x can be thought of as a new probability distribution. The following definition generalizes these ideas.


Definition
   Given discrete random variables X and Y with joint probability mass function f_XY(x, y), the conditional probability mass function of Y given X = x is

       f_{Y|x}(y) = f_XY(x, y) / f_X(x)     for f_X(x) > 0                  (5-4)
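A minimal Python sketch of Equation 5-4 is given below. It conditions a dictionary-based joint probability mass function on X = x; storing the joint pmf as a dictionary keyed by (x, y) makes both the marginal sum and the conditioning step one-line operations. The multinomial joint pmf in the usage example reuses the assumed Example 5-1 parameters (0.9, 0.08, 0.02) mentioned earlier.

```python
from math import factorial

def conditional_pmf_y_given_x(joint, x):
    """Conditional pmf of Y given X = x, per Equation 5-4:
    f_{Y|x}(y) = f_XY(x, y) / f_X(x), defined only when f_X(x) > 0.
    `joint` is a dict mapping (x, y) pairs to f_XY(x, y)."""
    f_x = sum(p for (xi, _), p in joint.items() if xi == x)   # marginal f_X(x)
    if f_x <= 0:
        raise ValueError("f_X(x) must be positive to condition on X = x")
    return {y: p / f_x for (xi, y), p in joint.items() if xi == x}

# Usage with the assumed Example 5-1 multinomial joint pmf (n = 4 bits,
# per-bit probabilities 0.9 acceptable, 0.08 suspect, 0.02 unacceptable).
p_a, p_s, p_u, n = 0.9, 0.08, 0.02, 4
joint = {
    (x, y): factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
            * p_a**x * p_s**y * p_u**(n - x - y)
    for x in range(n + 1) for y in range(n + 1 - x)
}
print(conditional_pmf_y_given_x(joint, 3))   # approximately {0: 0.200, 1: 0.800}
```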