
The channel capacity is defined as the maximum value of $I(A^n; B^n)/n$ over all possible probability distributions $P(a^n)$ and lengths $n$; that is,

$$C \triangleq \max_{P(a^n),\, n} \frac{I(A^n; B^n)}{n} \quad \text{bits/event}. \tag{1.36}$$
          It is also noted that, if the input events are statistically independent (i.e., from
       a memoryless information source), then the channel capacity of Eq. (1.36) can
       be written as

$$C = \max_{P(a)} I(A; B) \quad \text{bits/event}. \tag{1.37}$$
We emphasize that evaluation of the channel capacity is by no means simple; it can be quite involved.
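Although Eq. (1.37) rarely has a closed form, it can be evaluated numerically. The following sketch (our illustration, not from the text; it assumes NumPy) applies the standard Blahut-Arimoto iteration to a discrete memoryless channel and checks the result against the well-known closed form $C = 1 - H(\varepsilon)$ for a binary symmetric channel with crossover probability $\varepsilon$:

```python
import numpy as np

def blahut_arimoto(W, tol=1e-12, max_iter=10_000):
    """Estimate C = max_{P(a)} I(A; B) in bits for a discrete
    memoryless channel with transition matrix W[a, b] = p(b/a)."""
    nx = W.shape[0]
    p = np.full(nx, 1.0 / nx)                   # start from a uniform input distribution
    for _ in range(max_iter):
        joint = p[:, None] * W                  # p(a) p(b/a)
        q = joint / joint.sum(axis=0)           # posterior q(a/b)
        logq = np.log(np.where(W > 0, q, 1.0))  # terms with p(b/a) = 0 drop out
        p_new = np.exp((W * logq).sum(axis=1))
        p_new /= p_new.sum()
        if np.abs(p_new - p).max() < tol:
            p = p_new
            break
        p = p_new
    py = p @ W                                  # output distribution P(b)
    ratio = np.divide(W, py, out=np.ones_like(W), where=W > 0)
    return (p[:, None] * W * np.log2(ratio)).sum(), p

# Binary symmetric channel with crossover probability eps:
eps = 0.1
W = np.array([[1 - eps, eps],
              [eps, 1 - eps]])
C, p_opt = blahut_arimoto(W)
H2 = -(eps * np.log2(eps) + (1 - eps) * np.log2(1 - eps))
print(C, 1.0 - H2)   # both are about 0.531 bits/event
```

At convergence the iteration recovers both the capacity and the optimizing input distribution, which for the symmetric channel is uniform, as expected.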



1.3.2. CONTINUOUS CHANNEL

A channel is said to be continuous if and only if the input and output ensembles are represented by continuous Euclidean spaces. For simplicity, we restrict our discussion to the one-dimensional case, although it can easily be generalized to higher dimensions.
Again, we denote by A and B the input and output ensembles, but this time A and B are continuous random variables. It is also noted that a continuous channel can be either time discrete or time continuous. We first discuss time-discrete channels and then consider time-continuous channels.
Like a discrete channel, a continuous channel is said to be memoryless if and only if its transitional probability density p(b/a) remains the same for all successive pairs of input and output events. A memoryless continuous channel is said to be disturbed by an additive noise if and only if the transitional probability density p(b/a) depends only on the difference between the output and input random variables, b − a:

$$p(b/a) = p(c), \tag{1.38}$$

where c = b − a.
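For instance (an illustrative case of ours, not from the text), if the noise $c$ is zero-mean Gaussian with variance $\sigma^2$, the transition density is a function of the difference $b - a$ alone, exactly as Eq. (1.38) requires:

$$p(b/a) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left[-\frac{(b - a)^2}{2\sigma^2}\right] = p(c), \qquad c = b - a.$$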
Thus, for additive channel noise the conditional entropy H(B/a) can be shown to be

$$H(B/a) = -\int_{-\infty}^{\infty} p(b/a) \log_2 p(b/a)\, db = -\int_{-\infty}^{\infty} p(c) \log_2 p(c)\, dc. \tag{1.39}$$
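As a quick numerical check (a sketch of ours, assuming NumPy; the Gaussian noise and the value of σ are illustrative), Eq. (1.39) can be verified for zero-mean Gaussian noise, whose differential entropy has the closed form $\tfrac{1}{2}\log_2(2\pi e \sigma^2)$ bits, independent of the input value $a$:

```python
import numpy as np

# Verify Eq. (1.39) for zero-mean Gaussian noise with variance sigma^2:
# H(B/a) reduces to the noise entropy -\int p(c) log2 p(c) dc and is
# the same for every input value a.
sigma = 0.5   # illustrative noise standard deviation
c = np.linspace(-8.0 * sigma, 8.0 * sigma, 200_001)
p_c = np.exp(-c**2 / (2.0 * sigma**2)) / np.sqrt(2.0 * np.pi * sigma**2)

H_numeric = -np.trapz(p_c * np.log2(p_c), c)          # -∫ p(c) log2 p(c) dc
H_closed = 0.5 * np.log2(2.0 * np.pi * np.e * sigma**2)
print(H_numeric, H_closed)   # both are about 1.047 bits
```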