
As a result of the conditional probabilities $P(a_i/b_j) \le 1$ and $P(b_j/a_i) \le 1$, we
see that

$$I(a_i; b_j) \le I(a_i), \tag{1.3}$$

and

$$I(a_i; b_j) \le I(b_j), \tag{1.4}$$

where

$$I(a_i) \triangleq -\log_2 P(a_i),$$
$$I(b_j) \triangleq -\log_2 P(b_j).$$

$I(a_i)$ and $I(b_j)$ are defined as the respective input and output self-information of
event $a_i$ and event $b_j$. In other words, $I(a_i)$ and $I(b_j)$ represent the amount of
information provided at the input and output of the information channel for
event $a_i$ and event $b_j$, respectively. It follows that the mutual information of
event $a_i$ and event $b_j$ is equal to the self-information of event $a_i$ if and only if
$P(a_i/b_j) = 1$, in which case

$$I(a_i; b_j) = I(a_i). \tag{1.5}$$

It is noted that if Eq. (1.5) is true for all $i$, that is, for the entire input ensemble, then the
communication channel is noiseless. However, if $P(b_j/a_i) = 1$, then

$$I(a_i; b_j) = I(b_j). \tag{1.6}$$

If Eq. (1.6) is true for the entire output ensemble, then the information channel is
deterministic.
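
As a rough numerical illustration of Eqs. (1.3) through (1.6), the following Python
sketch evaluates the per-event quantities for a small hypothetical binary channel.
The prior and transition probabilities below are assumed for the example only, and
the per-event mutual information is taken in its standard form
$I(a_i; b_j) = \log_2 [P(a_i/b_j)/P(a_i)]$, which is consistent with Eqs. (1.3) and (1.4).

```python
import math

# Hypothetical binary channel, for illustration only: input events a1, a2
# with prior probabilities P(a), and transition probabilities P(b/a).
P_a = {"a1": 0.7, "a2": 0.3}
P_b_given_a = {("a1", "b1"): 0.9, ("a1", "b2"): 0.1,
               ("a2", "b1"): 0.2, ("a2", "b2"): 0.8}

# Joint probabilities P(a, b) and output marginals P(b).
P_ab = {(a, b): P_a[a] * P_b_given_a[(a, b)] for (a, b) in P_b_given_a}
P_b = {b: sum(P_ab[(a, b)] for a in P_a) for b in ("b1", "b2")}

def self_info(p):
    # Self-information I = -log2 P, in bits.
    return -math.log2(p)

def mutual_info(a, b):
    # Per-event mutual information I(a; b) = log2 [ P(a/b) / P(a) ].
    p_a_given_b = P_ab[(a, b)] / P_b[b]
    return math.log2(p_a_given_b / P_a[a])

for (a, b) in P_ab:
    print(f"I({a};{b}) = {mutual_info(a, b):+.3f} bits"
          f"  <=  I({a}) = {self_info(P_a[a]):.3f} bits")
```

Whenever $P(a_i/b_j) = 1$, the two printed quantities coincide, which is precisely the
noiseless condition of Eq. (1.5).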
         To conclude this section, we note that the information measure as defined
       in the preceding can be easily extended to higher product spaces, such as






Since the measure of information can be characterized by an ensemble average, the
average amount of information provided at the input end can be written as


$$-\sum_{A} P(a) \log_2 P(a) \triangleq H(A), \tag{1.8}$$

       where the summation is over the input ensemble A.
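
As a small numerical illustration of Eq. (1.8), the sketch below computes $H(A)$ for a
hypothetical four-event input ensemble; the probabilities are assumed for the example
and are not taken from the text.

```python
import math

def entropy(probs):
    # Average self-information H(A) = -sum_A P(a) log2 P(a), in bits per event.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical input ensemble A with four events.
P_A = [0.5, 0.25, 0.125, 0.125]
print(entropy(P_A))  # 1.75 bits
```

For comparison, four equally probable events would give the maximum value
$\log_2 4 = 2$ bits.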