
In-Class Exercise

Pb. 3.19 In a measurement of two power values, $P_1$ and $P_2$, it was determined that:

$$G_1 = 9\ \text{dB} \quad \text{and} \quad G_2 = -11\ \text{dB}$$

Using the above table, determine the value of the ratio $P_2/P_1$.
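As a cross-check on the table lookup, the decibel relation can also be inverted directly. Below is a minimal MATLAB sketch, assuming the usual convention that a power gain in dB is $G = 10\log_{10}(P/P_{\text{ref}})$ with the same reference power for both measurements:

% Power ratio from decibel gains, assuming G = 10*log10(P/Pref)
% with a common reference Pref for both measurements.
G1 = 9;                      % gain of P1, in dB
G2 = -11;                    % gain of P2, in dB
ratio = 10^((G2 - G1)/10)    % P2/P1; the common reference cancels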


3.6.3 Entropy

Given a random variable X (such as the number of spots on the face of a thrown die) whose possible outcomes are $x_1, x_2, x_3, \ldots$, and such that the probability of each outcome is, respectively, $p(x_1), p(x_2), p(x_3), \ldots$, then the entropy for this system described by the outcome of one random variable is defined by:

$$H(X) = -\sum_{i=1}^{N} p(x_i)\log_2\bigl(p(x_i)\bigr) \qquad (3.14)$$

where N is the number of possible outcomes, and the logarithm is to base 2. The entropy is a measure of the uncertainty in the value of the random variable. In Information Theory, it will be shown that the entropy, so defined, is the number of bits, on average, required to describe the random variable X.
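Eq. (3.14) is straightforward to evaluate numerically. The following is a minimal MATLAB sketch (the function name entropy_bits is ours, not the book's); it adopts the standard convention that outcomes with $p(x_i) = 0$ contribute nothing, since $p\log_2 p \to 0$ as $p \to 0$:

function H = entropy_bits(p)
% ENTROPY_BITS  Entropy (in bits) of the probability vector p, per Eq. (3.14).
% Zero-probability entries are dropped, using the convention 0*log2(0) = 0.
    p = p(p > 0);              % keep only the nonzero probabilities
    H = -sum(p .* log2(p));    % H(X) = -sum_i p(x_i) log2 p(x_i)
end

For a fair die, for example, entropy_bits(ones(1,6)/6) returns $\log_2 6 \approx 2.585$ bits.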



In-Class Exercises

Pb. 3.20 In each of the following cases, find the entropy (a numerical check is noted after the list):
   a. $N = 32$ and $p(x_i) = \frac{1}{32}$ for all $i$
   b. $N = 8$ and $p = \left\{ \frac{1}{2}, \frac{1}{4}, \frac{1}{8}, \frac{1}{16}, \frac{1}{64}, \frac{1}{64}, \frac{1}{64}, \frac{1}{64} \right\}$
   c. $N = 4$ and $p = \left\{ \frac{1}{2}, \frac{1}{4}, \frac{1}{8}, \frac{1}{8} \right\}$
   d. $N = 4$ and $p = \left\{ \frac{1}{2}, \frac{1}{4}, \frac{1}{4}, 0 \right\}$