Page 294 - Design for Six Sigma a Roadmap for Product Development

264   Chapter Eight


           susceptible to analytical and conceptual approaches in the early
           stages and to statistical methods in the validation phases because of
           unanticipated factors called noise factors. Organized complexity
           suggests the utilization of a new paradigm for simplification that
           makes use of information and complexity measures. (See Chap. 5.)
             The amount of information generated or needed is a measure of the
           level of complexity involved in the design. In this product design context,
           complexity and information are related to the degree to which the
           manufacturing operations conform to the required specifications, that
           is, to capability. In addition, the selection of machining processes
           contributes to complexity through compatibility: the compatibility of
           the machining processes with the specification requirements is another
           ingredient of complexity. Compatibility is concerned with the related
           engineering and scientific knowledge. Selecting the wrong machine to
           attain a certain DP will increase the complexity encountered in
           delivering the right components. Compatibility is the essence of
           production and manufacturing engineering and of material science, both
           of which are beyond the scope of this chapter.
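The link between capability and complexity can be made concrete: if a machining operation's output is assumed to be normally distributed, the probability that it delivers a DP inside its specification limits follows directly from the process mean and standard deviation. The following sketch is illustrative only (the function name and numbers are not from the text):

```python
from math import erf, sqrt

def prob_within_spec(mean, sigma, lsl, usl):
    """Probability that a normally distributed process output
    falls inside the specification limits [lsl, usl]."""
    # Standard normal CDF expressed via the error function.
    phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))
    return phi((usl - mean) / sigma) - phi((lsl - mean) / sigma)

# A process centered on target with sigma = tolerance/3
# (i.e., Cp = 1) stays within specification about 99.73%
# of the time -- the classical "three sigma" capability.
p = prob_within_spec(mean=10.0, sigma=0.1, lsl=9.7, usl=10.3)
```

A less capable process (larger sigma relative to the tolerance) yields a lower probability of success and, as the following sections argue, a higher information content and hence higher complexity.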
             Complexity in physical entities is related to the information
           required to meet each FR, which, in turn, is a function of the DP to
           which it is mapped. The ability to satisfy an FR is a function of
           machine capability, since we can't always achieve the targeted
           performance exactly. Thus, an FR should be satisfied within the
           tolerance of its design range, FR ∈ [T ± ΔFR]. The amount of
           complexity encountered in an FR is related to the probability of
           successfully meeting its mapped-to DPs. Since probability is related
           to complexity, we will explore the use of entropy information
           measures as a means of measuring complexity.
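One common way to quantify this link, used in axiomatic design, is to define the information content of an FR as I = log2(1/p), where p is the probability of satisfying the FR within its design range. The sketch below assumes this measure (it is not spelled out on this page):

```python
from math import log2

def information_content(p_success):
    """Information (in bits) associated with satisfying an FR
    whose probability of success is p_success: I = log2(1/p)."""
    if not 0.0 < p_success <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return log2(1.0 / p_success)

# A certain outcome carries zero information (no complexity);
# each halving of the success probability adds one bit.
i_certain = information_content(1.0)   # 0.0 bits
i_half = information_content(0.5)      # 1.0 bit
i_quarter = information_content(0.25)  # 2.0 bits
```

Under this measure, a design whose FRs are easy to satisfy (p near 1) carries little information and low complexity, while improbable FRs drive the information content, and thus the complexity, up.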


           8.6.2 Entropy complexity measures
           Shannon (1948) proved that, in a communication channel, the proba-
           bility of transmission error can be made arbitrarily small as long as
           the transmission rate does not exceed the channel capacity. Through
           his study of random processes, Shannon defined entropy as the level
           below which a signal cannot be compressed without loss. The intro-
           duction of the entropy principle was the origin of information theory.
           The original concept of entropy was introduced in the context of heat
           theory in the nineteenth century (Carnap 1977). Clausius used entropy
           as a measure of the disorganization of a system. The first fundamental
           form was developed by Boltzmann in 1896 during his work on the
           theory of ideal gases, in which he established a connection between
           the macroscopic property of entropy and the microscopic state of a
           system. The Boltzmann relation between entropy and work is well
           known, and the concept of entropy is used in thermodynamics to
           supplement its second law.
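Shannon's entropy for a discrete source is H = −Σ p·log2(p), measured in bits. A minimal sketch of this definition (the example probabilities are illustrative, not from the text):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits, of a
    discrete probability distribution."""
    assert abs(sum(probs) - 1.0) < 1e-9, "must be a distribution"
    # Terms with p = 0 contribute nothing (p*log2(p) -> 0).
    return -sum(p * log2(p) for p in probs if p > 0.0)

# A fair coin attains the maximum entropy for two outcomes: 1 bit.
h_fair = shannon_entropy([0.5, 0.5])
# A biased source is more predictable, hence lower entropy.
h_biased = shannon_entropy([0.9, 0.1])
```

The more uniform (disorganized) the distribution, the higher the entropy; a highly predictable source carries less information per symbol, which is the sense in which entropy serves as a complexity measure in the sections that follow.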