
We see that H(B/a) is independent of a, which is equivalent to saying that the
channel is uniform from the input. The average conditional entropy is


$$
H(B/A) = \int_{-\infty}^{\infty} p(a)\,H(B/a)\,da = -\int_{-\infty}^{\infty} p(c)\log_2 p(c)\,dc = H(B/a). \tag{1.40}
$$
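As a brief reminder of why H(B/a) does not depend on a (a one-line sketch, assuming the additive-noise relation b = a + c with noise independent of the input):

$$
H(B/a) = -\int_{-\infty}^{\infty} p(b/a)\log_2 p(b/a)\,db = -\int_{-\infty}^{\infty} p(c)\log_2 p(c)\,dc,
$$

since p(b/a) = p(c) with c = b - a, so the change of variable removes all dependence on a.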


In the evaluation of the channel capacity, we would first evaluate the average
mutual information I(A; B) and then maximize I(A; B) over the choice of the
input probability density p(a).
In view of I(A; B) = H(B) - H(B/A), and since Eq. (1.40) shows that H(B/A) is
determined by the noise statistics alone and does not depend on p(a), maximizing
H(B) maximizes I(A; B). However, H(B) cannot be made arbitrarily large, since it
is always restricted by certain physical constraints; namely, the available
power. This power constraint corresponds to the mean-square fluctuation of
the input signal:

$$
\sigma_a^2 = \int_{-\infty}^{\infty} a^2\,p(a)\,da.
$$
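To make this procedure concrete, the following is a minimal numerical sketch (not from the text): it discretizes an additive Gaussian noise channel, evaluates I(A; B) = H(B) - H(B/A) for a Gaussian input, and compares the result with the Gaussian-input value $\tfrac{1}{2}\log_2(1 + \sigma_a^2/\sigma_c^2)$. The variances, grid spacing, and function names are assumed purely for illustration.

```python
import numpy as np

# A minimal numerical sketch (not from the text): estimate I(A; B) for a
# discretized additive Gaussian noise channel b = a + c with a Gaussian
# input.  The variances and the grid spacing below are assumed values.
sigma_a2 = 4.0      # assumed input signal power (mean-square fluctuation)
sigma_c2 = 1.0      # assumed additive noise power
da = 0.01
x = np.arange(-30.0, 30.0, da)          # common amplitude grid

def gaussian(x, var):
    return np.exp(-x**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

p_a = gaussian(x, sigma_a2) * da        # discretized input probabilities
p_c = gaussian(x, sigma_c2) * da        # discretized noise probabilities
p_b = np.convolve(p_a, p_c, mode="same")  # discretized output probabilities (convolution)

def entropy_bits(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# For an additive channel H(B/A) equals the noise entropy [Eq. (1.40)], and
# the log2(da) discretization terms cancel in the difference below.
I_ab = entropy_bits(p_b) - entropy_bits(p_c)
print(I_ab)                                      # ~1.16 bits per sample
print(0.5 * np.log2(1.0 + sigma_a2 / sigma_c2))  # Gaussian-input benchmark
```

(For Gaussian noise, a Gaussian input maximizes H(B) under this power constraint, which is why the two printed values agree closely.)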

       Without loss of generality, we assume that the average value of the additive
       noise is zero:


$$
\bar{c} = \int_{-\infty}^{\infty} c\,p(c)\,dc = 0.
$$

Then the mean-square fluctuation of the output signal can be written as

$$
\sigma_b^2 = \int_{-\infty}^{\infty} b^2\,p(b)\,db.
$$
Since b = a + c (i.e., signal plus noise), one can show that

$$
\sigma_b^2 = \sigma_a^2 + \sigma_c^2, \tag{1.41}
$$
       where

$$
\sigma_c^2 = \int_{-\infty}^{\infty} c^2\,p(c)\,dc.
$$
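For completeness, a one-step sketch of how Eq. (1.41) follows, assuming (as in the additive-noise model) that the noise c is statistically independent of the signal a:

$$
\sigma_b^2 = \int_{-\infty}^{\infty} b^2\,p(b)\,db = \overline{(a + c)^2} = \overline{a^2} + 2\,\bar{a}\,\bar{c} + \overline{c^2} = \sigma_a^2 + \sigma_c^2,
$$

where the cross term vanishes because a and c are independent and $\bar{c} = 0$.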

       From the preceding equation, we see that setting an upper limit to the mean-
       square fluctuation of the input signal is equivalent to setting an upper limit to