through the communication channel. However, both Wiener and Shannon
share the same basic objective, namely, faithful reproduction of the original
signal.
1.2. ENTROPY INFORMATION
Let us now define the information measure, which is one of the vitally
important aspects in the development of Shannon's information theory. For
simplicity, we consider discrete input and output message ensembles A = {a_i}
and B = {b_j}, respectively, as applied to a communication channel, as shown
in Fig. 1.2. If a_i is an input event as applied to the information channel and b_j
is the corresponding transmitted output event, then the information measure
provided by the received event b_j about the input event a_i can be written as
$$
I(a_i; b_j) = \log_2 \frac{P(a_i/b_j)}{P(a_i)} \qquad (1.1)
$$
where P(a_i/b_j) is the conditional probability of the input event a_i given the
output event b_j, P(a_i) is the a priori probability of the input event a_i, i = 1, 2, ...,
M, and j = 1, 2, ..., N.
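As a quick numerical illustration (not part of the original text), Eq. (1.1) is straightforward to evaluate; the probability values below are hypothetical, a minimal sketch:

```python
import math

def information_measure(p_a_given_b: float, p_a: float) -> float:
    """Eq. (1.1): I(a_i; b_j) = log2[P(a_i/b_j) / P(a_i)], in bits."""
    return math.log2(p_a_given_b / p_a)

# Hypothetical numbers: receiving b_j raises the probability of a_i
# from an a priori P(a_i) = 0.25 to a conditional P(a_i/b_j) = 0.5,
# so the event b_j conveys exactly one bit about a_i.
print(information_measure(p_a_given_b=0.5, p_a=0.25))  # 1.0
```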
By the symmetric property of the joint probability, we show that

$$
I(a_i; b_j) = I(b_j; a_i). \qquad (1.2)
$$
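To see this, rewrite the conditional probability in Eq. (1.1) in terms of the joint probability, P(a_i, b_j) = P(a_i/b_j)P(b_j) = P(b_j/a_i)P(a_i), which puts the measure in a form that is symmetric in a_i and b_j:

$$
I(a_i; b_j) = \log_2 \frac{P(a_i, b_j)}{P(a_i)\,P(b_j)} = \log_2 \frac{P(b_j/a_i)}{P(b_j)} = I(b_j; a_i).
$$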
In other words, the amount of information transferred by the output event b_j
about a_i is the same as that provided by the input event a_i about b_j.
It is clear that, if the input and output events are statistically independent, that
is, if P(a_i, b_j) = P(a_i)P(b_j), then I(a_i; b_j) = 0.
Furthermore, if I(a_i; b_j) > 0, then P(a_i, b_j) > P(a_i)P(b_j); that is, there is a higher
joint probability of a_i and b_j than under independence. However, if I(a_i; b_j) < 0, then P(a_i, b_j) <
P(a_i)P(b_j); there is a lower joint probability of a_i and b_j.
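A short numerical check of these three cases, using the joint-probability form of the measure (the joint probabilities below are hypothetical):

```python
import math

def mutual_info(p_ab: float, p_a: float, p_b: float) -> float:
    """I(a_i; b_j) = log2[P(a_i, b_j) / (P(a_i) P(b_j))], in bits."""
    return math.log2(p_ab / (p_a * p_b))

# Take P(a_i) = P(b_j) = 0.5, so independence means P(a_i, b_j) = 0.25.
print(mutual_info(0.25, 0.5, 0.5))  #  0.0    -> statistically independent
print(mutual_info(0.40, 0.5, 0.5))  # +0.678  -> higher joint probability
print(mutual_info(0.10, 0.5, 0.5))  # -1.322  -> lower joint probability
```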
Fig. 1.2. An input-output communication channel.