
1. Entropy Information and Optics









       EXERCISES


       1.1 A picture is indeed worth more than a thousand words. For simplicity, we
           assume that an old-fashioned monochrome Internet screen has a 500 × 600
           pixel array and that each pixel element can provide eight distinguishable
           brightness levels. If we assume the selection of each pixel element is
           equiprobable, calculate the amount of information that can be provided
           by the Internet screen. On the other hand, a broadcaster has a pool of
           10,000 words; if he randomly picks 1000 words from this pool, calculate
           the amount of information provided by these words.
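
           As a quick check of the arithmetic, the following Python sketch computes
           both quantities under the equiprobability assumption stated in the
           exercise (all figures are the ones given above):

               import math

               # Screen: each of the 500 x 600 pixels takes one of 8 equiprobable
               # levels, so each pixel carries log2(8) = 3 bits.
               screen_bits = 500 * 600 * math.log2(8)      # 900,000 bits

               # Words: each of the 1000 words is drawn equiprobably from a pool
               # of 10,000, so each word carries log2(10,000) bits.
               word_bits = 1000 * math.log2(10_000)        # about 13,288 bits

               print(f"Screen:     {screen_bits:,.0f} bits")
               print(f"1000 words: {word_bits:,.1f} bits")
               print(f"Ratio:      {screen_bits / word_bits:.0f} to 1")

           The screen thus carries roughly 68 times the information of the
           thousand words, which is the point of the exercise's opening remark.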
       1.2 Given an n-ary symmetric channel, its transition probability matrix is
           written as

           $$
           [P] \;=\;
           \begin{bmatrix}
           1-p & \dfrac{p}{n-1} & \cdots & \dfrac{p}{n-1}\\[6pt]
           \dfrac{p}{n-1} & 1-p & \cdots & \dfrac{p}{n-1}\\[6pt]
           \vdots & \vdots & \ddots & \vdots\\[2pt]
           \dfrac{p}{n-1} & \dfrac{p}{n-1} & \cdots & 1-p
           \end{bmatrix}
           $$
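
           As a numerical companion, here is a small Python sketch (the function
           names are my own) that constructs this transition matrix and, by way
           of illustration, evaluates the capacity C = log2(n) − H(row); the
           symmetry of [P] makes every row's entropy the same and lets the
           uniform input attain capacity. That capacity is what the exercise
           goes on to ask is an assumption here.

               import numpy as np

               def nary_symmetric_channel(n, p):
                   """Transition matrix [P]: 1 - p on the diagonal, p/(n-1) elsewhere."""
                   P = np.full((n, n), p / (n - 1))
                   np.fill_diagonal(P, 1.0 - p)
                   return P

               def capacity_bits(n, p):
                   """C = log2(n) - H(row); by symmetry the uniform input is optimal."""
                   row = nary_symmetric_channel(n, p)[0]
                   nz = row[row > 0]               # guard against log2(0) when p = 0
                   return np.log2(n) + float(np.sum(nz * np.log2(nz)))

               P = nary_symmetric_channel(4, 0.1)
               print(P.sum(axis=1))          # every row sums to 1, as it must
               print(capacity_bits(4, 0.1))  # about 1.37 bits per symbol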