
7.4 Expected Value and Variance


Now let X be the random variable equal to the number of flips in an element of the sample space. That is, X(T) = 1, X(HT) = 2, X(HHT) = 3, and so on. Note that $p(X = j) = (1 - p)^{j-1} p$. The expected number of flips until the coin comes up tails equals E(X).
                                                        Using Theorem 1, we find that

$$E(X) = \sum_{j=1}^{\infty} j \cdot p(X = j) = \sum_{j=1}^{\infty} j(1 - p)^{j-1} p = p \sum_{j=1}^{\infty} j(1 - p)^{j-1} = p \cdot \frac{1}{p^2} = \frac{1}{p}.$$

[The evaluation of the infinite series in the next-to-last equality follows from Table 2 in Section 2.4, which tells us that $\sum_{j=1}^{\infty} j(1 - p)^{j-1} = 1/(1 - (1 - p))^2 = 1/p^2$.] It follows that the expected number of times the coin is flipped until tails comes up is 1/p. Note that when the coin is fair we have p = 1/2, so the expected number of flips until it comes up tails is 1/(1/2) = 2. ▲
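As a quick empirical check on this result, here is a minimal Python sketch (our own illustration, not from the text; the function name flips_until_tails and the choice p = 0.3 are arbitrary) that simulates flipping a biased coin until tails appears and compares the average number of flips with the closed form 1/p:

```python
import random

def flips_until_tails(p):
    """Flip a coin that comes up tails with probability p; return the number
    of flips up to and including the first tails."""
    flips = 1
    while random.random() >= p:  # an event of probability 1 - p: the flip is heads
        flips += 1
    return flips

random.seed(1)  # make the run reproducible
p, trials = 0.3, 100_000
average = sum(flips_until_tails(p) for _ in range(trials)) / trials
print(f"simulated average = {average:.3f}, theoretical 1/p = {1 / p:.3f}")
```

For p = 0.3 the printed average should land close to 1/p ≈ 3.333, in line with the computation above.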
The random variable X that equals the number of flips until a coin comes up tails is an example of a random variable with a geometric distribution.


DEFINITION 2     A random variable X has a geometric distribution with parameter p if $p(X = k) = (1 - p)^{k-1} p$ for k = 1, 2, 3, ..., where p is a real number with 0 ≤ p ≤ 1.


                                                     Geometric distributions arise in many applications because they are used to study the time
                                                     required before a particular event happens, such as the time required before we find an object
                                                     with a certain property, the number of attempts before an experiment succeeds, the number of
                                                     times a product can be used before it fails, and so on.
                                                        When we computed the expected value of the number of flips required before a coin comes
                                                     up tails, we proved Theorem 4.

                                     THEOREM 4        If the random variable X has the geometric distribution with parameter p, then E(X) = 1/p.





                                                     Independent Random Variables

We have already discussed independent events. We will now define what it means for two random variables to be independent.


DEFINITION 3     The random variables X and Y on a sample space S are independent if

    $p(X = r_1 \text{ and } Y = r_2) = p(X = r_1) \cdot p(Y = r_2),$

or in words, if the probability that $X = r_1$ and $Y = r_2$ equals the product of the probabilities that $X = r_1$ and that $Y = r_2$, for all real numbers $r_1$ and $r_2$.
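On a finite sample space, Definition 3 can be verified by exhaustive enumeration. The sketch below is our own illustration (the names independent, X1, and X2 are ours), anticipating the two-dice setting of Example 11, where X1 and X2 are the numbers appearing on the first and second die; it compares each joint probability with the product of the corresponding marginals, using exact fractions to avoid rounding error:

```python
from itertools import product
from fractions import Fraction

# Uniform sample space for rolling a pair of dice: 36 equally likely outcomes.
sample_space = list(product(range(1, 7), repeat=2))
p_outcome = Fraction(1, len(sample_space))

def X1(outcome):
    return outcome[0]  # number on the first die

def X2(outcome):
    return outcome[1]  # number on the second die

def independent(X, Y, space, p):
    """Test Definition 3: p(X = r1 and Y = r2) = p(X = r1) * p(Y = r2) for all r1, r2."""
    for r1 in {X(s) for s in space}:
        for r2 in {Y(s) for s in space}:
            joint = sum(p for s in space if X(s) == r1 and Y(s) == r2)
            marginals = (sum(p for s in space if X(s) == r1) *
                         sum(p for s in space if Y(s) == r2))
            if joint != marginals:
                return False
    return True

print(independent(X1, X2, sample_space, p_outcome))  # True: the two dice are independent
```

Replacing X2 with, say, the sum of the two dice would make the check fail, since knowing the sum constrains the value of the first die.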


EXAMPLE 11     Are the random variables $X_1$ and $X_2$ from Example 4 independent?

Solution: Let S = {1, 2, 3, 4, 5, 6}, and let i ∈ S and j ∈ S. Because there are 36 possible outcomes when the pair of dice is rolled and each is equally likely, we have

    $p(X_1 = i \text{ and } X_2 = j) = 1/36.$