Page 76 - Introduction to Information Optics
Exercises
(a) Evaluate the channel capacity.
(b) Repeat part (a) for a binary channel.
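The channel for part (a) is not reproduced in this copy, but part (b) can be sketched for the standard binary symmetric channel, whose capacity is C = 1 - H(p) with H the binary entropy function (the crossover probability p below is an assumed illustration value):

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))  # noiseless channel: 1 bit per use
print(bsc_capacity(0.5))  # totally noisy channel: 0 bits per use
```

The capacity falls from 1 bit per use at p = 0 to zero at p = 1/2, where the output is independent of the input.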
1.3 Show that an information source provides the maximum amount of
information if and only if the probability distribution of its
ensemble is equiprobable.
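The claim in this exercise can be checked numerically: for a fixed alphabet size n, the entropy is maximized (at log2 n bits) by the uniform distribution. A minimal sketch with an assumed four-symbol source:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p log2 p, in bits; p = 0 terms contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 4
uniform = [1 / n] * n           # equiprobable ensemble
skewed = [0.7, 0.1, 0.1, 0.1]   # any non-uniform ensemble of the same size

print(entropy(uniform))  # log2(4) = 2 bits, the maximum
print(entropy(skewed))   # strictly less than 2 bits
```

Any deviation from equal probabilities strictly lowers the entropy, which is what the formal proof (via the convexity of x log x, or Jensen's inequality) establishes.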
1.4 Let an input ensemble to a discrete memoryless channel be
A = {a_1, a_2, a_3} with probabilities of occurrence p(a_1) = 1/2,
p(a_2) = 1/4, p(a_3) = 1/4, and let B = {b_1, b_2, b_3} be the set of
the output ensemble. If the transition matrix of the channel is given by
a. Calculate the output entropy H(B).
b. Compute the conditional entropy H(A/B).
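The transition matrix for this exercise is not legible in this copy, so the computation is sketched with an assumed symmetric matrix; substitute the exercise's actual p(a_i) and [p(b_j/a_i)] values. The recipe is the same in any case: form the joint distribution, marginalize for H(B), and use H(A/B) = H(A, B) - H(B):

```python
import math

def entropy(probs):
    """Shannon entropy in bits, ignoring zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed values for illustration only -- replace with the exercise's data.
p_a = [0.5, 0.25, 0.25]
P = [[0.8, 0.1, 0.1],   # row i holds p(b_j / a_i)
     [0.1, 0.8, 0.1],
     [0.1, 0.1, 0.8]]

# Joint distribution p(a_i, b_j) = p(a_i) p(b_j / a_i)
joint = [[p_a[i] * P[i][j] for j in range(3)] for i in range(3)]

# a. Output distribution p(b_j) = sum_i p(a_i, b_j), then H(B)
p_b = [sum(joint[i][j] for i in range(3)) for j in range(3)]
H_B = entropy(p_b)

# b. Conditional entropy H(A/B) = H(A, B) - H(B)
H_AB = entropy([joint[i][j] for i in range(3) for j in range(3)])
H_A_given_B = H_AB - H_B
print(round(H_B, 4), round(H_A_given_B, 4))
```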
1.5 Let us consider a memoryless channel with input ensemble A = {a_1,
a_2, ..., a_r} and output ensemble B = {b_1, b_2, ..., b_s}, and channel
matrix [p(b_j/a_i)]. A random decision rule may be formalized by assuming
that if the channel output is b_j, for every j = 1, 2, ..., s, the decoder
will select a_i with probability q(a_i/b_j), for every i = 1, 2, ..., r.
Show that for a given input distribution there is no random decision rule
that will provide a lower probability of error than the ideal observer.
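The inequality asserted here can be checked numerically: the ideal observer puts all decision weight on the input maximizing the posterior p(a_i/b_j), and no randomization over q(a_i/b_j) can do better. A sketch with an assumed binary channel and prior:

```python
# Assumed binary channel and input distribution, for illustration only.
p_a = [0.6, 0.4]
P = [[0.9, 0.1],   # row i holds p(b_j / a_i)
     [0.2, 0.8]]

# Joint distribution, output marginal, and posterior p(a_i / b_j)
joint = [[p_a[i] * P[i][j] for j in range(2)] for i in range(2)]
p_b = [joint[0][j] + joint[1][j] for j in range(2)]
post = [[joint[i][j] / p_b[j] for j in range(2)] for i in range(2)]

def error_prob(q):
    """Average error probability of a random decision rule q[i][j] = q(a_i / b_j)."""
    correct = sum(p_b[j] * q[i][j] * post[i][j]
                  for i in range(2) for j in range(2))
    return 1.0 - correct

# Ideal observer: for each b_j, decide on the a_i with the largest posterior.
ideal = [[0.0] * 2 for _ in range(2)]
for j in range(2):
    best = max(range(2), key=lambda i: post[i][j])
    ideal[best][j] = 1.0

coin_flip = [[0.5, 0.5], [0.5, 0.5]]  # a genuinely random rule
print(error_prob(ideal), error_prob(coin_flip))
```

Since the probability of a correct decision is a weighted average of the posteriors, it is maximized by concentrating q(a_i/b_j) on the largest posterior for each b_j, which is exactly the ideal-observer rule.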
1.6 The product of two discrete memoryless channels C_1 and C_2 is a channel
the inputs of which are ordered pairs (a_i, a'_j) and the outputs of which
are ordered pairs (b_k, b'_l), where the first coordinates belong to the
alphabet of C_1 and the second coordinates to the alphabet of C_2. If the
transition probability of the product channel is

P(b_k, b'_l / a_i, a'_j) = p(b_k / a_i) p(b'_l / a'_j),
determine the capacity of the product channel.
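Because the factored transition probability makes the two coordinates statistically independent, the mutual information splits into a sum and the product channel's capacity is C_1 + C_2. A numerical sketch using two binary symmetric channels with assumed crossover probabilities:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

# Assumed component channels C_1 and C_2 (crossover probabilities 0.1, 0.2).
c1 = bsc_capacity(0.1)
c2 = bsc_capacity(0.2)
print(c1 + c2)  # capacity of the product channel: C_1 + C_2
```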
1.7 Develop a maximum-likelihood decision rule and determine the
probability of error for a discrete memoryless channel, as given by,

P =

where the input ensemble probabilities are p(a_1) = … and p(a_2) = ….
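The channel matrix and input probabilities for this exercise are not legible in this copy, so the procedure is sketched with assumed values: decode each output b_j to the input a_i that maximizes the likelihood p(b_j/a_i), then sum the probability mass of every input-output pair the rule decides incorrectly:

```python
# Assumed binary channel and priors, for illustration only -- substitute
# the exercise's actual matrix P and input probabilities.
p_a = [0.5, 0.5]
P = [[0.7, 0.3],   # row i holds p(b_j / a_i)
     [0.4, 0.6]]

# Maximum-likelihood rule: decode b_j to the a_i maximizing p(b_j / a_i).
decode = [max(range(2), key=lambda i: P[i][j]) for j in range(2)]

# Probability of error: mass of all (a_i, b_j) pairs decoded to the wrong input.
p_err = sum(p_a[i] * P[i][j]
            for i in range(2) for j in range(2) if decode[j] != i)
print(decode, p_err)
```

With equal priors, the maximum-likelihood rule coincides with the ideal observer of Exercise 1.5.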