
       which implies that the mutual information (the amount of information
       transferred) cannot be greater than the entropy information (the amount of
       information provided) at either the input or the output end, whichever is
       smaller. We note that if the equality holds for Eq. (1.24), then the channel
       is noiseless; on the other hand, if the equality holds for Eq. (1.25), then
       the channel is deterministic.
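
       As a minimal numerical sketch of these bounds (assuming Python with
       NumPy, and a binary symmetric channel whose crossover probability eps
       and uniform input are hypothetical values chosen for illustration), one
       may verify that I(A; B) never exceeds H(A) or H(B); a noiseless channel
       (eps = 0) attains the equality I(A; B) = H(A), while a completely random
       channel (eps = 0.5) yields I(A; B) = 0.

        # Minimal sketch, assuming NumPy; the binary symmetric channel and
        # its crossover probabilities are hypothetical example values.
        import numpy as np

        def entropy(p):
            """Shannon entropy in bits of a probability vector p."""
            p = p[p > 0]                      # ignore zero-probability terms
            return -np.sum(p * np.log2(p))

        def mutual_information(joint):
            """I(A;B) = H(A) + H(B) - H(AB), i.e., Eq. (1.26) rearranged."""
            pa = joint.sum(axis=1)            # marginal p(a)
            pb = joint.sum(axis=0)            # marginal p(b)
            return entropy(pa) + entropy(pb) - entropy(joint.ravel())

        for eps in (0.0, 0.1, 0.5):           # assumed crossover probabilities
            # uniform input p(a); the channel flips a bit with probability eps
            joint = 0.5 * np.array([[1 - eps, eps],
                                    [eps, 1 - eps]])
            pa, pb = joint.sum(axis=1), joint.sum(axis=0)
            print(f"eps={eps}: I(A;B)={mutual_information(joint):.4f}, "
                  f"H(A)={entropy(pa):.4f}, H(B)={entropy(pb):.4f}")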
         Since Eq. (1.13) can be written as


                          H(AB) = H(A) + H(B) - I(A; B),             (1.26)

       we see that,


                             I(A; B) = H(A) - H(A/B),                (1.27)
                             I(A; B) = H(B) - H(B/A),                (1.28)

       where H(A/B) represents the amount of information loss (e.g., due to
       noise), known as the equivocation of the channel; it is the average amount
       of additional information needed to specify the input symbol once the
       output has been observed. H(B/A), the average amount of information
       needed to specify the noise disturbance in the channel, is referred to as
       the noise entropy of the channel.
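
       These identities can be checked numerically. The following minimal
       sketch (assuming NumPy; the joint distribution below is hypothetical,
       chosen only for illustration) computes the equivocation and the noise
       entropy directly from the conditional probabilities and shows that
       Eqs. (1.27) and (1.28) give the same mutual information.

        # Quick numerical check, assuming NumPy; the joint matrix is a
        # hypothetical p(a, b) chosen only for illustration.
        import numpy as np

        def entropy(p):
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        joint = np.array([[0.45, 0.05],       # hypothetical joint p(a, b)
                          [0.10, 0.40]])
        pa, pb = joint.sum(axis=1), joint.sum(axis=0)

        # H(B/A) = -sum p(a,b) log2 p(b/a): the noise entropy
        H_B_given_A = -sum(joint[i, j] * np.log2(joint[i, j] / pa[i])
                           for i in range(2) for j in range(2))
        # H(A/B) = -sum p(a,b) log2 p(a/b): the equivocation
        H_A_given_B = -sum(joint[i, j] * np.log2(joint[i, j] / pb[j])
                           for i in range(2) for j in range(2))

        print(entropy(pa) - H_A_given_B)      # I(A;B) via Eq. (1.27)
        print(entropy(pb) - H_B_given_A)      # I(A;B) via Eq. (1.28), same value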
         To conclude this section, we note that the entropy information can be
       easily extended to a continuous product space:


                H(A) ≜ -∫_{-∞}^{∞} p(a) log₂ p(a) da,                        (1.29)

                H(B) ≜ -∫_{-∞}^{∞} p(b) log₂ p(b) db,                        (1.30)

                H(B/A) ≜ -∫_{-∞}^{∞} ∫_{-∞}^{∞} p(a, b) log₂ p(b/a) da db,   (1.31)

                H(A/B) ≜ -∫_{-∞}^{∞} ∫_{-∞}^{∞} p(a, b) log₂ p(a/b) da db,   (1.32)

       and

                H(AB) ≜ -∫_{-∞}^{∞} ∫_{-∞}^{∞} p(a, b) log₂ p(a, b) da db,   (1.33)


       where the p's are the corresponding probability density functions.
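
       As a minimal sketch of the continuous case (assuming NumPy and SciPy,
       with a zero-mean Gaussian density whose standard deviation sigma is an
       assumed example value), Eq. (1.29) can be evaluated by numerical
       quadrature and compared with the well-known closed form for a Gaussian,
       H(A) = ½ log₂(2πeσ²).

        # Minimal sketch, assuming NumPy/SciPy; the zero-mean Gaussian p(a)
        # and sigma = 2.0 are assumed values for the example.
        import numpy as np
        from scipy.integrate import quad

        sigma = 2.0

        def p(a):
            """Gaussian probability density with standard deviation sigma."""
            return np.exp(-a**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

        # H(A) = -∫ p(a) log2 p(a) da, Eq. (1.29); the tails beyond 10 sigma
        # contribute negligibly, so a finite integration range suffices.
        H_numeric, _ = quad(lambda a: -p(a) * np.log2(p(a)),
                            -10 * sigma, 10 * sigma)
        H_closed = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
        print(H_numeric, H_closed)            # both ≈ 3.047 bits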