
where the equalities hold if and only if a and b are statistically independent.
         Let us now turn to the definition of average mutual information. We consider
       first the conditional average mutual information:

$$ I(A;b) \triangleq \sum_{A} p(a/b)\, I(a;b). \qquad (1.18) $$


Although the mutual information between an input event and an output event can
be negative, I(a;b) < 0, the average conditional mutual information can never
be negative:

$$ I(A;b) \geq 0, \qquad (1.19) $$

with the equality holding if and only if the events a are statistically independent of
b; that is, p(a/b) = p(a) for all a.
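
As a quick numerical illustration (not from the text), the following Python sketch uses a made-up joint distribution p_ab over two binary ensembles; it shows that an individual value I(a;b) = log2[p(a/b)/p(a)] can indeed be negative while the conditional average of Eq. (1.18) stays nonnegative:

```python
import numpy as np

# Hypothetical joint distribution p(a, b); the numbers are illustrative only.
p_ab = np.array([[0.40, 0.10],
                 [0.15, 0.35]])        # rows index a, columns index b
p_a = p_ab.sum(axis=1)                 # marginal p(a)
p_b = p_ab.sum(axis=0)                 # marginal p(b)

b = 0                                  # condition on a single output event b
p_a_given_b = p_ab[:, b] / p_b[b]      # conditional p(a/b)

# Pointwise mutual information I(a;b) = log2[p(a/b)/p(a)]: can be negative.
I_pointwise = np.log2(p_a_given_b / p_a)
print(I_pointwise)                     # approx [0.54, -0.87]: one entry < 0

# Conditional average of Eq. (1.18): a Kullback-Leibler divergence, so >= 0.
I_A_b = np.sum(p_a_given_b * I_pointwise)
print(I_A_b)                           # approx 0.155 bit
```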
By taking the ensemble average of Eq. (1.18), the average mutual informa-
tion can be defined as

$$ I(A;B) \triangleq \sum_{B} p(b)\, I(A;b). \qquad (1.20) $$
Since p(b)p(a/b) = p(a, b), Eq. (1.20) can be written as

$$ I(A;B) = \sum_{A} \sum_{B} p(a,b)\, I(a;b). \qquad (1.21) $$
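
Continuing the same made-up distribution, a short sketch confirms numerically that the two forms (1.20) and (1.21) produce the same value:

```python
import numpy as np

p_ab = np.array([[0.40, 0.10],                    # same illustrative joint p(a, b)
                 [0.15, 0.35]])
p_a, p_b = p_ab.sum(axis=1), p_ab.sum(axis=0)     # marginals

def I_A_cond(b):
    """Conditional average mutual information I(A;b) of Eq. (1.18)."""
    p_agb = p_ab[:, b] / p_b[b]                   # p(a/b)
    return np.sum(p_agb * np.log2(p_agb / p_a))

# Eq. (1.20): weight each I(A;b) by p(b) and sum over B.
I_via_20 = sum(p_b[b] * I_A_cond(b) for b in range(len(p_b)))

# Eq. (1.21): double sum over the joint distribution, using p(a,b) = p(b)p(a/b).
I_via_21 = np.sum(p_ab * np.log2(p_ab / np.outer(p_a, p_b)))

assert np.isclose(I_via_20, I_via_21)             # the two forms agree
```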

       Again we see that

$$ I(A;B) \geq 0. \qquad (1.22) $$

       The equality holds if and only if a and b are statistically independent.
Moreover, from the symmetric property of I(a;b), we see that

$$ I(A;B) = I(B;A). \qquad (1.23) $$
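
In the same numerical sketch, the symmetry of Eq. (1.23) amounts to transposing the joint distribution, which leaves the double sum of Eq. (1.21) unchanged:

```python
import numpy as np

p_ab = np.array([[0.40, 0.10],                    # same illustrative joint p(a, b)
                 [0.15, 0.35]])
p_a, p_b = p_ab.sum(axis=1), p_ab.sum(axis=0)

def mi(joint, pa, pb):
    """Average mutual information, Eq. (1.21), for a joint distribution."""
    return np.sum(joint * np.log2(joint / np.outer(pa, pb)))

# Transposing the joint swaps the roles of A and B: I(A;B) = I(B;A).
assert np.isclose(mi(p_ab, p_a, p_b), mi(p_ab.T, p_b, p_a))
```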

       In view of Eqs. (1.3) and (1.4), we also see that

$$ I(A;B) \leq H(A) = I(A), \qquad (1.24) $$

$$ I(A;B) \leq H(B) = I(B). \qquad (1.25) $$
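
A final check on the same illustrative distribution verifies the bounds of Eqs. (1.24) and (1.25), with H(A) and H(B) computed as ordinary Shannon entropies in bits:

```python
import numpy as np

p_ab = np.array([[0.40, 0.10],                    # same illustrative joint p(a, b)
                 [0.15, 0.35]])
p_a, p_b = p_ab.sum(axis=1), p_ab.sum(axis=0)

I_AB = np.sum(p_ab * np.log2(p_ab / np.outer(p_a, p_b)))
H_A = -np.sum(p_a * np.log2(p_a))                 # entropy of the input ensemble
H_B = -np.sum(p_b * np.log2(p_b))                 # entropy of the output ensemble

# Eqs. (1.24)-(1.25): mutual information never exceeds either entropy.
assert I_AB <= H_A and I_AB <= H_B
print(I_AB, H_A, H_B)                             # approx 0.191, 1.0, 0.993 bits
```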