1.3.1. MEMORYLESS DISCRETE CHANNEL
For simplicity, we let an input message to be transmitted through the channel be

$$\alpha^n = \alpha_1 \alpha_2 \cdots \alpha_n,$$

and the corresponding output message be

$$\beta^n = \beta_1 \beta_2 \cdots \beta_n,$$

where $\alpha_i$ and $\beta_j$ are any one of the input and output events of $A$ and $B$, respectively.
Since the transitional probabilities for a memoryless channel do not depend on the preceding events, the composite transitional probability can be written as

$$P(\beta^n/\alpha^n) = \prod_{i=1}^{n} P(\beta_i/\alpha_i).$$
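To make the product rule concrete, here is a minimal Python sketch, assuming a binary symmetric channel with crossover probability 0.1; the transition matrix P and the helper trans_prob are illustrative choices, not taken from the text.

```python
import numpy as np

# Assumed example: binary symmetric channel (BSC) with crossover 0.1.
# P[i, j] = P(beta_j / alpha_i), the single-event transitional probability.
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])

def trans_prob(alpha_seq, beta_seq, P):
    """Composite transitional probability of a memoryless channel:
    P(beta^n / alpha^n) = product over i of P(beta_i / alpha_i)."""
    return np.prod([P[a, b] for a, b in zip(alpha_seq, beta_seq)])

# Example: an n = 3 message with one symbol received in error.
print(trans_prob([0, 1, 0], [0, 1, 1], P))  # 0.9 * 0.9 * 0.1 = 0.081
```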
Thus, the joint probability of the output message p" is
PCS") = I P(oc
where the summation is over the A" product space.
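For small $n$ the summation over the $A^n$ product space can be carried out directly. The sketch below, again assuming the same illustrative binary symmetric channel and an i.i.d. uniform input distribution p_alpha, enumerates every $\alpha^n$ with itertools.product.

```python
import itertools
import numpy as np

P = np.array([[0.9, 0.1],
              [0.1, 0.9]])       # assumed BSC transition matrix
p_alpha = np.array([0.5, 0.5])   # assumed i.i.d. input distribution

def output_prob(beta_seq, P, p_alpha):
    """P(beta^n) = sum over the A^n product space of
    P(alpha^n) * P(beta^n / alpha^n), with i.i.d. input symbols."""
    n = len(beta_seq)
    total = 0.0
    for alpha_seq in itertools.product(range(len(p_alpha)), repeat=n):
        p_a = np.prod([p_alpha[a] for a in alpha_seq])  # P(alpha^n)
        p_b_given_a = np.prod([P[a, b] for a, b in zip(alpha_seq, beta_seq)])
        total += p_a * p_b_given_a
    return total

# With a uniform input and a symmetric channel, every output sequence of
# length 3 is equally likely: P(beta^3) = 1/8.
print(output_prob([0, 1, 1], P, p_alpha))  # 0.125
```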
In view of the entropy information measure, the average mutual information between the input and output messages (sequences) $\alpha^n$ and $\beta^n$ can be written as

$$I(A^n; B^n) = H(B^n) - H(B^n/A^n), \qquad (1.34)$$

where $B^n$ is the output product space, for which $H(B^n)$ can be written as

$$H(B^n) = -\sum_{B^n} P(\beta^n) \log_2 P(\beta^n).$$

The conditional entropy $H(B^n/A^n)$ is

$$H(B^n/A^n) = -\sum_{A^n} \sum_{B^n} P(\alpha^n)\, P(\beta^n/\alpha^n) \log_2 P(\beta^n/\alpha^n). \qquad (1.35)$$
Since $I(A^n; B^n)$ represents the amount of information provided by the $n$ output events about the given $n$ input events, $I(A^n; B^n)/n$ is the amount of mutual information per event. If the channel is assumed memoryless, $I(A^n; B^n)/n$ is only a function of $P(\alpha^n)$ and $n$. Therefore, the capacity of the channel would be the maximum of $I(A^n; B^n)/n$ over all possible input probability distributions $P(\alpha^n)$ and all $n$.
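For a memoryless channel this maximization reduces to maximizing the single-event mutual information $I(A; B)$ over the input distribution. The sketch below, a brute-force grid search over binary input distributions for the same assumed binary symmetric channel, recovers the familiar BSC capacity $1 - H(0.1) \approx 0.531$ bit; the grid search is an illustrative shortcut, not a method prescribed by the text.

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.1, 0.9]])  # assumed BSC with crossover 0.1

def mutual_info(p_a, P):
    """Single-event I(A;B) = H(B) - H(B/A), in bits."""
    p_b = p_a @ P
    H_B = -np.sum(p_b * np.log2(p_b))
    H_B_given_A = -np.sum(p_a[:, None] * P * np.log2(P))
    return H_B - H_B_given_A

# Brute-force search over binary input distributions (q, 1 - q).
qs = np.linspace(0.001, 0.999, 999)
C = max(mutual_info(np.array([q, 1 - q]), P) for q in qs)
print(C)  # ~0.531 bit, matching 1 - H(0.1); maximized at q = 0.5
```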