
1. Entropy Information and Optics




        H(Z) = H(Y/X) = -\int_Z p(\mathbf{c}) \log_2 p(\mathbf{c})\, dZ.
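As a quick sanity check of this definition, the sketch below numerically integrates -p(c) log2 p(c) for a zero-mean Gaussian density and compares the result with the closed form (1/2) log2(2*pi*e*sigma^2), which is the expression the derivation relies on below. This is a minimal illustration assuming NumPy and SciPy; the value of sigma is arbitrary and not from the text.

    import numpy as np
    from scipy.integrate import quad

    # Differential entropy -∫ p(c) log2 p(c) dc of a zero-mean Gaussian,
    # compared against the closed form (1/2) log2(2*pi*e*sigma^2).
    sigma = 1.5  # illustrative standard deviation (assumption)

    def p(c):
        """Zero-mean Gaussian density with standard deviation sigma."""
        return np.exp(-c**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

    # Integrate over +/- 12 sigma, where the density has not yet underflowed.
    numeric, _ = quad(lambda c: -p(c) * np.log2(p(c)), -12 * sigma, 12 * sigma)
    closed = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)

    print(f"numerical integral: {numeric:.6f} bits")
    print(f"closed form       : {closed:.6f} bits")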

The channel capacity can be determined by maximizing I(X; Y); that is,

        C = \max_{T,\, p(x)} \frac{I(X; Y)}{T} \qquad \text{bits/time},                    (1.55)

under the constraint that the mean-square fluctuation of the signal cannot exceed a specified value S:

        \int_X |\mathbf{a}|^2 p(\mathbf{a})\, dX \le S.                    (1.56)
Since each of the vectors a, b, and c is represented by 2TΔν continuous variables, and each c_i is statistically independent and Gaussianly distributed with zero mean and a variance equal to N_0/2T, we see that

        I(X; Y) = I(A; B) = \sum_{i=1}^{2T\Delta\nu} I(A_i; B_i).                    (1.57)

Thus, from Eq. (1.43), we have

        H(Z) = 2T\Delta\nu\, H(C_i),                    (1.58)

where

        H(C_i) = -\int_{-\infty}^{\infty} p(c_i) \log_2 p(c_i)\, dc_i = \frac{1}{2} \log_2\left(\frac{\pi e N_0}{T}\right).
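Because the coordinates are statistically independent, the joint noise entropy is simply 2TΔν copies of the per-coordinate entropy H(C_i); the short sketch below illustrates this additivity numerically, assuming SciPy. The values n and var stand in for 2TΔν and N_0/2T and are purely illustrative.

    import numpy as np
    from scipy.stats import multivariate_normal, norm

    # H(Z) = 2*T*dv * H(C_i): entropy adds over independent Gaussian coordinates.
    n = 8       # stands in for 2*T*delta_nu (illustrative)
    var = 0.25  # stands in for N0/(2*T) (illustrative)

    joint = multivariate_normal(mean=np.zeros(n), cov=var * np.eye(n))
    single = norm(loc=0.0, scale=np.sqrt(var))

    # scipy reports differential entropy in nats; divide by ln 2 for bits.
    H_joint_bits = joint.entropy() / np.log(2)
    H_single_bits = single.entropy() / np.log(2)

    print(f"H(Z)      : {H_joint_bits:.6f} bits")
    print(f"n * H(C_i): {n * H_single_bits:.6f} bits")  # identical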
If we let N = 2T\Delta\nu\, \sigma_{c_i}^2 = N_0 \Delta\nu, then H(Z) can be written as

        H(Z) = T\Delta\nu \log_2\left(\frac{\pi e N}{T\Delta\nu}\right).                    (1.59)
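The step from Eq. (1.58) to Eq. (1.59) is pure algebra: substitute σ_{c_i}^2 = N/(2TΔν) into 2TΔν · (1/2) log2(2πe σ_{c_i}^2). A quick symbolic confirmation, assuming SymPy (the symbol names are illustrative):

    import sympy as sp

    T, dv, N = sp.symbols('T dv N', positive=True)  # dv stands for delta_nu

    var_ci = N / (2 * T * dv)  # per-coordinate noise variance
    lhs = 2 * T * dv * sp.Rational(1, 2) * sp.log(2 * sp.pi * sp.E * var_ci, 2)
    rhs = T * dv * sp.log(sp.pi * sp.E * N / (T * dv), 2)

    print(sp.simplify(lhs - rhs))  # prints 0, confirming Eq. (1.59)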

In view of Eq. (1.42), we see that

        H(B_i) \le \frac{1}{2} \log_2\left(2\pi e\, \sigma_{b_i}^2\right),                    (1.60)

where the equality holds if and only if b_i is Gaussianly distributed, with zero mean and a variance equal to \sigma_{b_i}^2. Since b = a + c, p(b) is Gaussianly distributed if and only if p(a) is also Gaussianly distributed with zero mean.
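Equation (1.60) rests on the maximum-entropy property of the Gaussian: among all densities with a fixed variance, the Gaussian has the largest differential entropy. A minimal numerical illustration, assuming SciPy, compares a Gaussian with a uniform density of the same variance:

    import numpy as np
    from scipy.stats import norm, uniform

    # For a fixed variance, the Gaussian maximizes differential entropy.
    var = 1.0                  # common variance (illustrative)
    width = np.sqrt(12 * var)  # a uniform on [-w/2, w/2] has variance w**2/12

    gauss = norm(loc=0.0, scale=np.sqrt(var))
    unif = uniform(loc=-width / 2, scale=width)

    # scipy entropies are in nats; divide by ln 2 for bits.
    print(f"Gaussian: {gauss.entropy() / np.log(2):.4f} bits")  # ≈ 2.0471
    print(f"Uniform : {unif.entropy() / np.log(2):.4f} bits")   # ≈ 1.7925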