1.1. INFORMATION TRANSMISSION
Although we seem to know the meaning of the word information, fundamentally that may not be the case. In practice, information may be defined in relation to its usage. From the viewpoint of mathematical formalism, entropy information is basically a probabilistic concept; in other words, without probability theory there would be no entropy information.
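As a minimal numerical sketch of this probabilistic viewpoint (our own illustration, not taken from the text; the four-symbol distribution below is hypothetical), the entropy of a discrete source can be computed directly from its symbol probabilities, giving the average information in bits per symbol:

```python
import math

def shannon_entropy(probabilities):
    """Entropy H = -sum(p * log2(p)) in bits for a discrete distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical four-symbol source; the probabilities must sum to 1.
p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))  # 1.75 bits per symbol
```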
An information transmission system can be represented by a block diagram, as shown in Fig. 1.1. For example, a message represents an information source that is to be sent by means of a set of written characters, which constitute a code. If the set of written characters is recorded on a piece of paper, the information still cannot be transmitted until the paper is illuminated by visible light (the transmitter), which acts as an information carrier. When the light reflected from the written characters arrives at your eyes (the receiver), a decoding (translating) process takes place, namely character recognition (decoding) by the user (your mind). This simple example shows that a suitable encoding process is not adequate by itself; a suitable decoding process must also take place.
For instance, if I show you a foreign newspaper, you might not be able to decode the language, even though the optical channel is assumed to be perfect (i.e., noiseless). This is because a suitable decoding process requires a priori knowledge of the encoding scheme, for example, knowledge of the foreign characters. Thus the decoding process is also known as a recognition process. Information transmission can, in fact, be represented in terms of spatial and temporal information. The preceding example of transmitting written characters obviously represents spatial information transmission. On the other hand, if the written language is transmitted by coded light pulses, then this
Fig. 1.1. Block diagram of a communication system.
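To make the role of shared a priori knowledge concrete, the following sketch (our own illustration, not from the text) models the block diagram of Fig. 1.1 as an encoder, a noiseless channel, and a decoder; the code table is a hypothetical Morse-like example shared in advance by both ends:

```python
# Hypothetical code table shared a priori by encoder and decoder.
CODE = {"A": ".-", "B": "-...", "C": "-.-."}
INVERSE = {v: k for k, v in CODE.items()}

def encode(message):
    """Information source -> coded symbols (spaces separate code words)."""
    return " ".join(CODE[ch] for ch in message)

def channel(signal):
    """Noiseless channel: the carrier delivers the signal unchanged."""
    return signal

def decode(signal):
    """Receiver: decoding works only with a priori knowledge of the code table."""
    return "".join(INVERSE[word] for word in signal.split(" "))

sent = encode("CAB")
received = channel(sent)
print(received)          # -.-. .- -...
print(decode(received))  # CAB
```

If the decoder lacked the table, that is, the a priori knowledge discussed above, the received code words would remain meaningless symbols, just as the characters of the foreign newspaper do to a reader who does not know the language.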