Similarly, the average amount of self-information provided at the output
end can be written as

I(B) \triangleq -\sum_{B} P(b) \log_2 P(b) \triangleq H(B).    (1.9)
These two equations have essentially the same form as the entropy equation
in statistical thermodynamics, for which the notations H(A) and H(B) are
frequently used to describe information entropy. As we will see later, H(A)
and H(B) indeed reveal a profound relationship between information entropy
and physical entropy. It is noted that entropy H, from the communication
theory point of view, is a measure of uncertainty, whereas from the
statistical thermodynamic standpoint, H is a measure of disorder.
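Since H is simply an ensemble average of -log_2 P, it is easy to evaluate
numerically. The following is a minimal sketch in Python; the four-event
distribution and the function name entropy are ours, chosen only for
illustration, and the second call anticipates the equiprobable case of
Eq. (1.11) below.

import math

def entropy(probs):
    # Shannon entropy H = -sum p log2 p, in bits; zero-probability terms contribute nothing
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical four-event input ensemble A
P_A = [0.5, 0.25, 0.125, 0.125]
print(entropy(P_A))         # 1.75 bits
print(entropy([0.25] * 4))  # 2.0 bits = log2(4), the equiprobable case of Eq. (1.11)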
In addition, we see that

H(A) \ge 0,    (1.10)

since 0 \le P(a) \le 1, so that every term -P(a) \log_2 P(a) is nonnegative.
The equality in Eq. (1.10) holds when some P(a) = 1 and, consequently, all
the other probabilities are zero. Thus, we can conclude that

H(A) \le \log_2 M,    (1.11)
where M is the number of input events. The equality holds for equiprobable
input events, that is, P(a) = 1/M. It is straightforward to extend the
ensemble average to the conditional entropy:
I(B/A) \triangleq -\sum_{B} \sum_{A} P(a, b) \log_2 P(b/a) \triangleq H(B/A).    (1.12)
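As a numerical illustration of Eq. (1.12), the sketch below computes H(B/A)
from a hypothetical 2 x 2 joint distribution P(a, b); the probability values
and the helper name cond_entropy are ours, not the text's.

import math

# Hypothetical joint distribution P(a, b): rows index a, columns index b
P_AB = [[0.30, 0.10],
        [0.10, 0.50]]

def cond_entropy(p_ab):
    # H(B/A) = -sum_{A,B} P(a, b) log2 P(b/a), per Eq. (1.12)
    h = 0.0
    for row in p_ab:
        p_a = sum(row)                       # marginal P(a)
        for p in row:
            if p > 0:
                h -= p * math.log2(p / p_a)  # P(b/a) = P(a, b) / P(a)
    return h

print(cond_entropy(P_AB))   # about 0.71 bit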
The entropy of the product ensemble AB can also be written as

H(AB) = -\sum_{A} \sum_{B} P(a, b) \log_2 P(a, b),    (1.13)

from which it follows that

H(AB) = H(A) + H(B/A),    (1.14)
H(AB) = H(B) + H(A/B).    (1.15)
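Equations (1.13) to (1.15) can be checked numerically on a hypothetical
joint distribution of the same kind as above; the short sketch below verifies
the additivity relation H(AB) = H(A) + H(B/A) of Eq. (1.14). The distribution
and helper names are again ours, for illustration only.

import math

P_AB = [[0.30, 0.10],
        [0.10, 0.50]]   # hypothetical joint distribution P(a, b)

def H(probs):
    # Shannon entropy in bits, skipping zero-probability terms
    return -sum(p * math.log2(p) for p in probs if p > 0)

def H_B_given_A(p_ab):
    # H(B/A) = -sum P(a, b) log2 P(b/a), as in Eq. (1.12)
    return -sum(p * math.log2(p / sum(row))
                for row in p_ab for p in row if p > 0)

P_A  = [sum(row) for row in P_AB]              # marginal P(a)
H_AB = H([p for row in P_AB for p in row])     # joint entropy, Eq. (1.13)

# Chain rule, Eq. (1.14): H(AB) = H(A) + H(B/A); both sides are about 1.69 bits
print(H_AB, H(P_A) + H_B_given_A(P_AB))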
In view of the inequality \ln u \le u - 1, we also see that

H(B/A) \le H(B),    (1.16)
H(A/B) \le H(A),    (1.17)