

occurs when the alphabet symbols carry equal probabilities of being
chosen. In this situation, sampling yields the maximum information,
and that amount of information reflects the maximum uncertainty that
preceded the sampling process.

           8.6.3 Shannon entropy
A function H satisfies the preceding three attributes if and only if it
has the form

    H_b(p_1, p_2, \ldots, p_m) = -\sum_{k=1}^{m} p_k \log_b p_k          (8.12)

where b > 1.* If p_k = 0, then p_k \log_b p_k = 0. The function H is
called b-ary entropy.
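  To make Eq. (8.12) concrete, the following is a minimal Python sketch;
the function name shannon_entropy and the example distributions are ours,
for illustration only:

```python
import math

def shannon_entropy(p, b=2.0):
    """b-ary Shannon entropy of a discrete distribution, Eq. (8.12).

    Terms with p_k = 0 contribute nothing, following the convention
    p_k * log_b(p_k) = 0 when p_k = 0 stated in the text.
    """
    if abs(sum(p) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return -sum(pk * math.log(pk, b) for pk in p if pk > 0.0)

# A uniform source maximizes H: four equally likely symbols carry 2 bits,
# while a skewed source over the same four symbols carries less.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
print(shannon_entropy([0.70, 0.10, 0.10, 0.10]))  # about 1.357
```

The first call illustrates the point made at the top of this section:
entropy, and hence the information gained by sampling, is largest when
all symbols are equally likely.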
  Shannon entropy is a significant measure of information. When an
event's probability is small, we are surprised by its occurrence. We are
uncertain whether rare events will happen, and thus their occurrences
carry considerable amounts of information.† Therefore, we should expect
the information conveyed by an event to decrease as its probability
increases. Shannon entropy is the expected value of the function
\log_b(1/p) with respect to the distribution p. It is a measure of a
discrete information source. Boltzmann entropy, on the other hand, may
be used in the case of a continuous information source. It has an
appealing mathematical form that may be considered the continuous
analog of Shannon's entropy. Boltzmann information has an integral in
which p_k is replaced with the probability density function, pdf f(.).
For some pdfs there is no closed-form integral. One solution to this
problem is to use a digitized version of Boltzmann entropy that has the
form of Shannon's entropy. The discrete version is, however, an
approximate solution (El-Haik 1996).
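  The expectation interpretation above can be checked numerically. The
following sketch (the function name and the sample distribution are our
illustration) draws symbols from p and averages \log_b(1/p_k):

```python
import math
import random

def entropy_as_expectation(p, n=200_000, b=2.0, seed=0):
    """Monte Carlo estimate of E[log_b(1/p_K)] for a symbol K drawn from p."""
    rng = random.Random(seed)
    symbols = range(len(p))
    total = 0.0
    for _ in range(n):
        k = rng.choices(symbols, weights=p)[0]
        total += math.log(1.0 / p[k], b)
    return total / n

p = [0.5, 0.25, 0.125, 0.125]
print(entropy_as_expectation(p))                 # close to 1.75
print(-sum(pk * math.log(pk, 2.0) for pk in p))  # exact H: 1.75 bits
```

The sample average converges to the exact entropy as n grows, which is
just the law of large numbers applied to the definition of H as an
expected value.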

           8.6.4 Boltzmann entropy
Boltzmann entropy h_b(f) of a continuous random variable X with a den-
sity f(x) is defined as

    h_b(f) = -\int_S f(x) \log_b f(x) \, dx    (if the integral exists)    (8.13)

where S is the support set, S = \{x : f(x) > 0\}, of the random variable.
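  Since some densities admit no closed-form integral, the digitized
approximation mentioned in Sec. 8.6.3 can be sketched as a midpoint
Riemann sum. Here it is validated on the standard normal density, whose
differential entropy is known in closed form; the function name and
truncation interval are our illustration:

```python
import math

def boltzmann_entropy(pdf, lo, hi, n=100_000, b=2.0):
    """Midpoint-rule approximation of -integral over S of f(x) log_b f(x) dx,
    Eq. (8.13), with the integral truncated to [lo, hi]."""
    dx = (hi - lo) / n
    h = 0.0
    for k in range(n):
        fx = pdf(lo + (k + 0.5) * dx)
        if fx > 0.0:  # restrict to the support set S
            h -= fx * math.log(fx, b) * dx
    return h

# Standard normal density; its exact b-ary entropy is 0.5 * log_b(2*pi*e).
pdf = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
print(boltzmann_entropy(pdf, -10.0, 10.0))          # about 2.047 bits
print(0.5 * math.log(2.0 * math.pi * math.e, 2.0))  # exact: about 2.047
```

As the text notes, any such discretization is an approximation; its
accuracy depends on the grid spacing and on how much probability mass
falls outside the truncated interval.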
  Example 8.1  The pitch diameter (PD) of a spur gear is a basic design
  parameter of gear design on which most calculations are based. The pitch circles


  *For b = 2 (b = e), H has the units of bits (nats), respectively (1 nat ≈ 1.44 bits).
             †Notice the link to outliers in a statistical random sample.